I'll just say, it's ok not to know, but saying 'obviously' when you in fact have no clue is a bad look. I think it's a good moment to reflect on how overconfident we can be on the internet, especially about incredibly complex topics that cross multiple disciplines and fields.
To answer your question: the model is in fact run entirely locally. But the model doesn't contain all of the training data. The model is the output of processing that data; kind of like how the expression 1 + 2 carries more information than its result, '3', the resulting model is orders of magnitude smaller than the dataset it was trained on.
The model consists of a bunch of variables, like knobs on a panel, and the training process is turning those knobs. The knobs themselves don't take up much space, but it takes a lot of information to figure out where each one should be turned to.
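A toy sketch of the knobs idea (nothing like a real LLM, just the principle): here "training" distills 1,000 data points down to a "model" that is only two numbers, a slope and an intercept, fitted by ordinary least squares.

```python
# Toy illustration: training tunes a few "knobs" using far more data
# than the knobs themselves hold. The "model" here is two numbers
# (slope, intercept) fitted to 1,000 points drawn from y = 2x + 1.

def train(xs, ys):
    """Ordinary least squares for a line y = w*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b  # the entire "model": two knob settings

xs = [x / 100 for x in range(1000)]  # 1,000 training inputs
ys = [2 * x + 1 for x in xs]         # 1,000 training targets
w, b = train(xs, ys)
print(w, b)  # two numbers distilled from 2,000; close to 2.0 and 1.0
```

You could delete the 2,000 training numbers afterwards and the two-number model would still work; that is the relationship between a trained model and its dataset, just at a vastly smaller scale.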
Not having access to the dataset is ok from a privacy standpoint. Even if you don't know how the data was used or where it was obtained, the important thing here is that your prompts are not being transmitted anywhere, because the model runs locally.
In short, using the model and training the model are very different tasks.
Edit: additionally, it's actually very easy to tell whether a piece of software running on hardware you own is contacting specific servers. The packet has to leave your computer, and your router has to be told where to send it, so you can just watch it. I'd advise checking out a piece of software called Wireshark.
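To make the "you can just watch it" point concrete, here is a minimal sketch of what tools like Wireshark do at their core: every outbound IPv4 packet carries its destination address at a fixed offset in the header, in plain view. This example just parses that field out of a hand-crafted header (the addresses are arbitrary illustrative values, not a claim about any real service).

```python
import socket
import struct

def dest_ip(ipv4_header: bytes) -> str:
    """Return the destination address from a raw IPv4 header.

    Bytes 16-19 of a standard IPv4 header are the destination address.
    """
    return socket.inet_ntoa(ipv4_header[16:20])

# Hand-crafted 20-byte IPv4 header (version/IHL, TOS, total length,
# id, flags+fragment, TTL, protocol, checksum, source, destination).
header = struct.pack("!BBHHHBBH4s4s",
                     0x45, 0, 20, 0, 0, 64, 6, 0,
                     socket.inet_aton("192.168.1.10"),   # source
                     socket.inet_aton("93.184.216.34"))  # destination

print(dest_ip(header))  # -> 93.184.216.34
```

A packet capture tool does exactly this, for every packet crossing your network interface, which is why local-only software genuinely cannot hide a phone-home from you.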
You kind of get it. It's not really a dictionary; it's more like a set of steps that transforms noise tinted with your data into more coherent data: pass the input through a series of valves that are each open a different amount.
If we set the valves just right, the output will look roughly like what we want it to.
Yes, LLMs are prone to hallucinations, which isn't always a bad thing. It's only bad when you need 100% accuracy, like for specific math.
I recommend 3Blue1Brown's videos on LLMs for a nice introduction to how they actually work.