None of what you suggested is particularly difficult with real dev work. You basically just said, "I want to vibe code it all." It's trivially easy to set up a seeded pseudorandom generator you actually control; deciding where enemies and objects go *should not* be left up to some black-box algorithmic "magic." Game design theory exists for a reason, and AI doesn't "know" about it, because it's just a complex pattern generator at the end of the day.
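To be concrete, here's roughly what "trivially easy" looks like. This is a minimal sketch, not anyone's actual project code; the grid size, the `place_enemies` name, and the forbidden set are all made up for illustration. The point is that a seeded PRNG gives you the same layout every run, so you can review it, tweak it, and test it.

```python
import random

# Minimal sketch (hypothetical names/values): deterministic, seed-driven
# placement of enemies on a tile grid. Same seed -> same layout every run,
# so a designer can review, tweak, and regression-test the result.

GRID_W, GRID_H = 32, 32  # illustration-only grid size

def place_enemies(seed, count, forbidden):
    rng = random.Random(seed)  # local PRNG; doesn't touch global random state
    candidates = [
        (x, y)
        for x in range(GRID_W)
        for y in range(GRID_H)
        if (x, y) not in forbidden  # designer-authored no-spawn tiles
    ]
    return rng.sample(candidates, count)

# Reproducible layout for "level 7"; change the seed, get a different but stable layout.
layout = place_enemies(seed=7, count=10, forbidden={(0, 0), (31, 31)})
```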
Also, what happens when the model generates an environment that can't be traversed? What if it places invisible walls in weird places? What about an environment that's rife with bugs? What if the code is plain wrong? Now you have to go into the code, learn how it works, and debug it manually. Thank god you saved yourself some time by vibe coding. /s
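And even in the vibe-coded case, you end up writing the boring validation yourself anyway. A rough sketch of the bare-minimum check (the `0 = floor, 1 = wall` grid encoding and the `spawn` argument are assumptions for illustration):

```python
from collections import deque

# Rough sketch of the validation you'd have to write either way: a flood fill
# that checks every walkable tile is reachable from the spawn point.
# Grid encoding (0 = floor, 1 = wall) and `spawn` are assumptions here.

def is_fully_traversable(grid, spawn):
    h, w = len(grid), len(grid[0])
    seen = {spawn}
    queue = deque([spawn])
    while queue:
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0 and (nx, ny) not in seen:
                seen.add((nx, ny))
                queue.append((nx, ny))
    walkable = {(x, y) for y in range(h) for x in range(w) if grid[y][x] == 0}
    return walkable <= seen  # every floor tile was reached from spawn
```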
I can see we won't agree, so you're welcome to get the last word, but I won't reply afterwards.
telorand@reddthat.com
90% of Games Developers Already Using AI in Workflows, According to New Google Cloud Research
LLMs and other machine learning are just algorithms. That's all procedural world generation is, and this insistence by the Tech Bros that we need their models to "boost creativity" is a farce. My opinion? The people who wish they could use AI to "see what they get out of it" are lazy-ass fucks who don't want to put in the extra work to actually get good at game dev.
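For what "just algorithms" means in practice, here's a minimal sketch of a classic cellular-automata cave generator. The sizes, fill probability, neighbour threshold, and iteration count are arbitrary illustration values, not from the article.

```python
import random

# Minimal sketch: classic cellular-automata cave generation.
# All constants below are arbitrary illustration values.

def generate_caves(width, height, seed, fill_prob=0.45, steps=4):
    rng = random.Random(seed)
    # 1 = wall, 0 = floor; start from random noise.
    grid = [[1 if rng.random() < fill_prob else 0 for _ in range(width)]
            for _ in range(height)]

    for _ in range(steps):
        nxt = [[0] * width for _ in range(height)]
        for y in range(height):
            for x in range(width):
                # Count the 8 neighbours that are walls; out-of-bounds counts as wall.
                walls = sum(
                    1
                    for dy in (-1, 0, 1)
                    for dx in (-1, 0, 1)
                    if (dx or dy) and (
                        not (0 <= y + dy < height and 0 <= x + dx < width)
                        or grid[y + dy][x + dx] == 1
                    )
                )
                # Smooth the noise: a tile becomes wall if enough neighbours are walls.
                nxt[y][x] = 1 if walls >= 5 else 0
        grid = nxt
    return grid

# Same seed, same map -- no model in sight.
cave = generate_caves(48, 32, seed=1234)
```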
> This survey was conducted online by The Harris Poll on behalf of Google Cloud from June 20, 2025 - July 9, 2025 among 615 adults aged 18+ working in game development in the United States, South Korea, Finland, Norway, and Sweden.

So, it's a voluntary poll. That's a great way to get a biased sample. Also, some of the responses sound like Google is playing fast and loose with the term "AI." Is procedural world generation AI? Google seems to think so, despite it existing long before LLMs were a thing. This whole thing reads like "research" designed to promote AI. I wonder why Google might want that? /s