jcc
New Reader
1/12/25 3:04 p.m.
The OP is assuming that AI is basically not smart enough to sort out the BS, and I agree it likely isn't right now, but IMO a useful telltale for future AI will be building a database of self-declared anarchists or outliers from a large enough dataset and simply ignoring them, unfortunately. This thread, for example, would be a good starting point for AI to build that database, no matter how valid the concerns we share here are.
The real problem, as I see it, is that in the big scheme of things the majority lean toward being willing sheep for whatever reason, and that will never change.
The first thing that comes to mind is: boy, is the AI going to be pissed when it figures that out.
A lot of it is in how you write the prompts to get good information out of AI.
And it's better to go one item at a time, in an iterative fashion, as opposed to dumping multiple questions into one prompt.
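To make the "one item at a time" idea concrete, here's a minimal sketch, assuming the OpenAI Python client and an API key in the environment; the model name and the example questions are just placeholders, and any chat-style API works the same way. Each focused question carries the prior answers along with it, so you refine step by step instead of dumping everything into one wall-of-text prompt.

```python
# Minimal sketch of iterative prompting: one focused question per turn,
# carrying the conversation history forward so each answer builds on the last.
# Assumes the OpenAI Python client (pip install openai) and OPENAI_API_KEY set;
# the model name and questions below are placeholders.
from openai import OpenAI

client = OpenAI()
history = []  # prior turns, so follow-ups have context

def ask(question: str) -> str:
    """Send one focused question and keep the running conversation."""
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

# Iterative: narrow the question each time instead of asking everything at once.
print(ask("What are the common causes of a P0171 lean code?"))
print(ask("Of those, which are most likely on a high-mileage Miata?"))
print(ask("How would I test the most likely one with basic hand tools?"))
```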
jcc
New Reader
1/13/25 10:41 a.m.
In reply to z31maniac :
I know some people that also share that same sophistication level.
Duke said:
In about another year of the current technology we're going to see an epidemic of AI Mad Cow disease. It's already happening regularly, but soon it will be pervasive.
I'm going to be disappointed if it takes a year. I think with leveraging the efficiency of AI we can get that down to 6 months.
Seriously, try limiting your search results in your engine of choice to "prior to 2024" and note the quality of the results compared to the modern stuff. It took 2 years to do to the internet what 200+ years of industrialization did to the environment. So it's already been way more efficient than humans in that aspect, at least!
WonkoTheSane said:
Duke said:
In about another year of the current technology we're going to see an epidemic of AI Mad Cow disease. It's already happening regularly, but soon it will be pervasive.
I'm going to be disappointed if it takes a year. I think with leveraging the efficiency of AI we can get that down to 6 months.
If there's something nerds love to do, it's figure out how to break the unbreakable.
TJL (Forum Supporter) said:
My first thought was of when people had fun with Microsoft's "Tay".
https://en.m.wikipedia.org/wiki/Tay_(chatbot)
Humans are already very, very good at screwing up AI datasets. A large part of large-model AI training is that it's a crap ton of data and none of it has been sanitized. I had something running on my Pi-hole that would just go to random parts of the web and poke around. The occasional ad still got through, but nothing was customized for me.
Right now, scraping the raw nonsense that humans voluminously spit out on the web is just going to train an AI model to act like us idiots. Not better.
I fail to understand the hype on this topic. The data quality question is just one of the issues. From my experience, people tend to think any data coming from a computer is perfect. Far from it. But as time goes on the geeks will run out of things to charge their time against and will go after this one. The problem is, once they fix it, all the old data needs to be run through the fix, or old crap is mixed in with the "good" stuff. Then they find the "new" good data cleanser and we start over again.
Let's remember this latest shiny coin is being created by the same types who wrote Windows 3.1, Vista, etc. They never get it perfect, we buy it anyway and pay their crazy salaries, only to pay them again for the next upgrade.
porschenut said:
I fail to understand the hype on this topic. The data quality question is just one of the issues. From my experience, people tend to think any data coming from a computer is perfect. Far from it. But as time goes on the geeks will run out of things to charge their time against and will go after this one. The problem is, once they fix it, all the old data needs to be run through the fix, or old crap is mixed in with the "good" stuff. Then they find the "new" good data cleanser and we start over again.
Let's remember this latest shiny coin is being created by the same types who wrote Windows 3.1, Vista, etc. They never get it perfect, we buy it anyway and pay their crazy salaries, only to pay them again for the next upgrade.
Part of this is mixing up AI and machine learning. Machine learning done on good data, with human eyes to interpret and confirm it, is astonishing. Almost unlimited applications, but super unsexy and time-consuming to get it set up and working. This is for the normal off-the-street person and their level of understanding.
AI is not a fad, but the processing power required and the complexity of dealing with data from billions of sources aren't solved yet, and it needs a ton more CPU/GPU power thrown at it, with all the environmental problems that entails.
In reply to wearymicrobe :
I think the only reason energy is getting guzzled like that is because it's just easier, and in the short term faster, than switching to more specialized hardware that would use less. Every AI company now thinks they're in a sprint race to be the company that makes a big chunk of the workforce obsolete, and nobody wants to pit for better tires.
GameboyRMH said:
In reply to wearymicrobe :
I think the only reason energy is getting guzzled like that is because it's just easier, and in the short term faster, than switching to more specialized hardware that would use less. Every AI company now thinks they're in a sprint race to be the company that makes a big chunk of the workforce obsolete, and nobody wants to pit for better tires.
Tons of money in the chip space is spent on reducing total power costs. If it were possible, NVIDIA would be doing it. Rack density and running costs are highly scrutinized. Big if, but if you can do all the calculations on a server somewhere, it's always going to be more energy-efficient than small distributed nodes in individual devices, which some companies are pushing hard.
But right now I swear every major IT group is going back to internal servers over cloud, so who knows what the winds will bring.
In reply to wearymicrobe :
Nvidia wants the AI industry to continue using CUDA, which is only practical, or at least only competitively efficient, on Nvidia's GPUs. That's their moat. Switching to something like Cerebras chips for training would take time and effort for the AI companies: time they believe could cost them the race to becoming the monopolistic AI megacorp of the coming post-human-labor dystopia, and effort that could've been put toward the immediate race instead.
On cloud repatriation, I've seen stats that around 5% of the industry is doing it, certainly not even 10%. I sure wouldn't mind getting a job with one of those companies and never having to hear about AWS or Azure again.