AI and the Future

Daniel Duffy

C++ author, trainer

The company hasn't detailed why it fired Chatterjee, but told the Times he'd been "terminated with cause." It also maintained that the original paper had been "thoroughly vetted" and peer-reviewed, and that the study challenging the claims "did not meet our standards."
 
What would it take for artificial intelligence to make real progress?



Geoffrey Hinton, “Godfather” of deep learning and one of the most celebrated scientists of our time, told a leading AI conference in Toronto in 2016: “If you work as a radiologist you’re like the coyote that’s already over the edge of the cliff but hasn’t looked down.” Deep learning is so well suited to reading images from MRIs and CT scans, he reasoned, that people should “stop training radiologists now” and that it’s “just completely obvious within five years deep learning is going to do better.”

Fast forward to 2022, and not a single radiologist has been replaced. Rather, the consensus view nowadays is that machine learning for radiology is harder than it looks [1]; at least for now, humans and machines complement each other’s strengths. [2]


Even I could have told you that.
 
The author of the article, Gary Marcus, had an interesting Twitter beef with Yann LeCun recently:

 
Hugely embarrassing and counter-productive for AI as a technology.
I hate all those buzzwords.
You would expect these "eminent" computer scientists to behave better.
 
CAREFUL NOW

AI-generated answers temporarily banned on coding Q&A site Stack Overflow


“The primary problem is that while the answers which ChatGPT produces have a high rate of being incorrect, they typically look like they might be good and the answers are very easy to produce.”


But while many users have been impressed by ChatGPT’s capabilities, others have noted its persistent tendency to generate plausible but false responses. Ask the bot to write a biography of a public figure, for example, and it may well insert incorrect biographical data with complete confidence. Ask it to explain how to program software for a specific function and it can similarly produce believable but ultimately incorrect code.
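The failure mode described above is worth making concrete. The snippet below is a hypothetical illustration (not an actual ChatGPT output) of the kind of plausible-looking but subtly wrong code a chatbot can produce: the buggy function reads naturally and runs without error, yet divides by the wrong denominator.

```python
def mean_of_positives(xs):
    """Looks plausible at a glance, but the denominator is wrong:
    it divides by the length of the whole list rather than by the
    number of positive values."""
    total = 0
    for x in xs:
        if x > 0:
            total += x
    return total / len(xs)  # BUG: should divide by the count of positives


def mean_of_positives_fixed(xs):
    """Correct version: average only over the positive values."""
    pos = [x for x in xs if x > 0]
    return sum(pos) / len(pos) if pos else 0.0


data = [3, -1, 5, -2]
print(mean_of_positives(data))        # 2.0 -- runs fine, but wrong
print(mean_of_positives_fixed(data))  # 4.0 -- the intended answer
```

The point is that the bug produces no exception and a believable number, which is exactly why such answers are "very easy to produce" and hard to catch in a quick review.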
 
In response to OpenAI CEO Sam Altman's suggestion that "a new version of Moore’s law that could start soon: the amount of intelligence in the universe doubles every 18 months," NYU professor and AI expert Gary Marcus said, "A new version of Moore’s law that has arguably already started: the amount of hype around AI doubles every 18 months."
 
The deeper problems as I see them are

1. Deskilling
2. Lack of discipline
3. Not real products in many cases, more like "cargo cult" programs.
4. Data Science doesn't have enough maths.
5. What happened to the "artisan expert programmer"?

I have seen this evolution for some (extended) time now.


 
There are fewer scientists in the data-science community these days. Some of my friends know the tooling and the basic algorithms well; they can do the job, but they lack a deeper understanding.
 