Is It Even Vibe Coding At This Point? (ChatGPT-5 Builds Me An App)

So, as is obvious from recent activity around here, I’ve been playing around with AI a lot and downloading all kinds of models and add-ons. It was becoming a real mess, and what a model does isn’t always obvious from the name. So for the past few days I’ve been telling myself I need to start documenting this stuff. First I started with a text file but realized more structure would be good. Then I thought of Excel or Google Sheets, but I HATE typing data into spreadsheets. Next I thought I’d whip up a little PHP web app and host it tucked away here somewhere.

Then I thought… what a great time to test the new ChatGPT-5 that came out this week.

My Prompt:

I’d like you to create a little application for me to store info on my AI LLM model files. Please create it in Python with a user interface. Data can be stored in json or any other no-sql system. For each model I’d like to collect the following info, and I’m open to other suggestions if you have them. But here’s the list I have:

Type (checkpoint, lora, etc)
Base Model
URL
Description of what it is used for

Can you build me that?

A few seconds later, the response:

Heck yes, Pete — let’s spin up a tiny, no-fuss desktop app to track your model zoo. I built you a self-contained Python/Tkinter app that:

* Stores everything in a portable `models.json` (no-SQL vibes, zero setup).
* Gives you a clean UI to add/edit/delete entries.
* Lets you search/filter across all fields.
* Imports/exports JSON.
* Includes your core fields (Type, Base Model, URL, Description) plus optional nerdy metadata you’ll probably want later (tags, local path, license, format, quantization, checksum, VRAM need, etc.).
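To give a feel for the "portable JSON, zero setup" approach, here's a minimal sketch of what a record and the load/save plumbing could look like. This is my guess at the shape, not the actual script's code; the field names and the `id` key are assumptions.

```python
import json
from pathlib import Path

CATALOG = Path("models.json")  # the app's default file, next to the script

def load_models():
    """Return the list of model records, or an empty list if no file yet."""
    if CATALOG.exists():
        return json.loads(CATALOG.read_text(encoding="utf-8"))
    return []

def save_models(models):
    """Write records back as pretty-printed, version-control-friendly JSON."""
    CATALOG.write_text(json.dumps(models, indent=2), encoding="utf-8")

# One record with the core fields plus a couple of the optional ones:
record = {
    "id": "flux-dev-lora-01",           # assumed unique key
    "type": "lora",
    "base_model": "Flux.1-dev",
    "url": "https://example.com/model",
    "description": "Style LoRA for portraits",
    "tags": ["style", "portrait"],
    "quantization": "INT8",
}
```

The nice part of a flat list of dicts like this is that optional fields can simply be absent until you need them.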

### Grab the files

* [Download `model_catalog.py`](sandbox:/mnt/data/model_catalog.py)
* [Download a starter `models.json`](sandbox:/mnt/data/models.json)

### How to run

1. Save both files to the same folder.
2. Run:

```bash
python model_catalog.py
```

That opens the GUI. It reads/writes `models.json` next to the script.

### How it works (quick tour)

* **Basic fields** at the top: Type, Base Model, URL, Description.
* **Optional metadata** (toggle in View → “Show Optional Fields”): tags, local path, license, format (GGUF/safetensors/ckpt), quantization (Q4/INT8/etc.), source (HuggingFace/Civitai), checksum, VRAM requirement, compatible engines, notes, etc.
* **Search** box filters the left-hand list live across all fields.
* **Open URL** button opens the selected model’s link (or whatever’s in the URL field).
* **Import/Export JSON** under File menu or buttons. Import merges by ID (no dupes).
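The "merges by ID (no dupes)" behavior could work roughly like this. A sketch only; I haven't shown the script's internals, and the `id` field name is an assumption.

```python
def merge_catalogs(existing, imported):
    """Merge imported records into existing, keyed by 'id'.

    Imported records win on an id conflict; the order of existing
    records is preserved, and new ids are appended at the end.
    """
    by_id = {m["id"]: m for m in existing}
    for rec in imported:
        by_id[rec["id"]] = rec  # same id -> replace, new id -> append
    return list(by_id.values())
```

Keying on a stable ID means you can re-import the same export repeatedly without ever doubling up entries.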

### Design notes / why this setup

* **Tkinter** keeps it dependency-free. No Streamlit/Flask needed.
* **JSON** is human-editable, easy to version-control, and portable.
* **Schema-flexible**: you can ignore optional fields now and start filling them as your catalog grows. Future-you will thank past-you when you’re asking “which LoRAs were INT8 and ran on 8 GB VRAM?”
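That "which LoRAs were INT8 and ran on 8 GB VRAM?" question really is a one-liner against the JSON once the metadata is filled in. A sketch, assuming field names `type`, `quantization`, and `vram_gb` (none of which are confirmed from the actual app):

```python
def find_loras(models, quant="INT8", max_vram_gb=8):
    """Return LoRA records with the given quantization that fit in max_vram_gb."""
    return [
        m for m in models
        if m.get("type") == "lora"
        and m.get("quantization") == quant
        and m.get("vram_gb", float("inf")) <= max_vram_gb  # skip if unknown
    ]
```

Records missing `vram_gb` are treated as not fitting, which is the conservative choice when the metadata hasn't been filled in yet.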

If you want extras next: checksum auto-calculation from a local file, drag-drop of files to populate `local_path`, or bulk metadata editing. I can also generate a CLI export that prints a neat table for quick terminal use, or a web UI version if you prefer browser-based vibes.

Annnnndddd… it worked perfectly. Once I started using it I did go back to ask for more features, but I didn’t type a single line of code. In fact, in honor of GPT’s personality, I named it “Model Zoo”.

It’s not the sexiest thing you’ll ever see, but it does exactly what I need it to do and really nothing else. I could have asked for an HTML/CSS/JavaScript version, and it might have been prettier, but I have SO many tabs open these days; I just wanted something standalone. The data is stored as a plain JSON file, so we’ll see how well it runs once it gets super full (if it ever gets super full), but it’s fine for now. All told, that was maybe 10-15 minutes of work. The first iteration took under a minute, though.

So my job is going to be extinct soon, at least in its present form. Hopefully, when companies switch from hiring coders to hiring prompters who can validate code, I’ll be ready to make the jump!

If for any reason anyone wants the python file, give a shout.

[Cover image generated at venice.ai using the Flux model and the prompt (itself AI enhanced by Venice) A high-tech android with sleek, metallic skin and glowing blue eyes, meticulously typing code on a futuristic laptop with a transparent, holographic interface, surrounded by floating data streams and glowing circuit patterns in a dimly lit, high-tech laboratory environment with sleek, minimalist furniture and ambient LED lighting, 16:9 aspect ratio]