George Rebane
Progress toward the Singularity is producing both good news and bad. The bad news is that the newer Large Language Models demand ever more eye-popping resources: server farms, data storage, electric power to operate, … , plus the sheer cost of training the new LLMs, which can now take months. The other problem is that we are running out of the stored data on which the LLMs munch to develop their smarts. (more here)
The good news is that the new generative AI LLMs are taking over more and more functional areas in corporate environments, e.g. IT functions such as customer service, procurement, human resources, and even software development. Under the high-level supervision of one human, an AI can replace scores of other humans in an ever-increasing number of jobs. (more here)
One example of such a function is how an AI can 'read' a collection of information about a subject area and produce a totally realistic and relevant conversation between two 'people' analyzing the contents of that information, even deriving deeper meanings from the input material. To illustrate this capability, my son-in-law used some new AI published by Google to create a podcast-like conversation (here) about yours truly, generated from having read only my RR posts from the beginning of 2024. I was blown away not only by its apparent authenticity (yes, it made a couple of minor mistakes), but also by the natural-sounding and appropriate interstitial phrases and observations woven into the conversation by the two virtual podcasters. Without looking behind the curtain, this conversation would fool (almost?) everyone into taking it as one between two actual humans.

