Yang: Automation Is Stealing Writing Jobs

Democratic presidential candidate Andrew Yang warns that automation – including the AI-aided automation of writing jobs – is critically threatening America’s future.


“Automation doesn’t just affect millions of factory workers and truck drivers,” Yang writes in a New York Times Op-Ed.

“Bookkeepers, journalists, retail and food service workers, office clerks, call center employees and even teachers also face the threat of being replaced by machines.”

Yang’s solution: Protect every U.S. adult from potential financial ruin by giving each a universal basic income of $1,000/month – courtesy of the U.S. Treasury.

In other AI-generated writing news:

*Your Journalism Job is Safe from AI, Study Finds: New research concludes that artificial intelligence is not a direct competitor for journalists’ jobs.

“Artificial intelligence does not pose a threat to professional journalism,” observe researchers Waleed Ali and Mohamed Hassoun.

Instead, AI is and will continue to be used to automate mundane newsroom tasks, freeing up journalists for more creative endeavors, according to the researchers.

*Though generally wunderkinds, AI journalism tools can sometimes go off the rails – triggering disastrous consequences, according to author Daniel Green.

“Entrusting artificial intelligence to produce articles has proved to be a huge time-saver, but it is prone to error,” observes Green, a writer for Journalism.co.uk.

Some of the bigger snafus: Incorrect football scores that generated “disastrous” headlines. And a seven-year-old who was reported to be 70 years old.

“The fact is, automated journalism is only as reliable as the information that has been plugged into it during its development and training,” Green observes.

*Journalism and Artificial Intelligence: The News Industry Awakens to Change: While some news organizations began using AI early in the decade, adoption industry-wide is still hit-and-miss, according to Charlie Beckett.

He’s director of Polis, the journalism think tank at the London School of Economics and Political Science.

“The state of AI in newsrooms ranges widely from early adopters to some that are still avoiding the technology,” Beckett says.

“Most newsrooms are using AI in piecemeal ways and are only just starting to see how they can create a strategy to maximize its effectiveness,” Beckett says.

“I think the best comparison is with the early stages of social media about ten years ago,” he adds.

Back then, some news organizations were simply experimenting with Facebook or Twitter — while others understood that this was eventually going to change their whole business, he says.

*AI Writing Toolmaker Offers Seamless Integration With Business Intelligence Software: Automated Insights has released an add-on module that enables its AI-generated writing software to integrate seamlessly with Tableau.

Tableau is a popular business intelligence package used by organizations to track and analyze data.

The integration enables Tableau’s graphic visualizations of data to be accompanied by easy-to-understand text explanations, courtesy of Automated Insights’ Wordsmith software.

Dubbed Wordsmith Go, Automated Insights’ add-on generates text explanations from the same data that Tableau uses to create graphic visualizations.
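For a rough sense of how this kind of integration works under the hood, here’s a minimal Python sketch that turns the same tabular data a BI dashboard might chart into a short plain-English narrative. It’s an illustration only: the data and function names are hypothetical, and this is not Automated Insights’ actual Wordsmith API.

```python
# Illustration only: derive a short narrative from the same tabular data
# a BI tool would chart. Hypothetical names; not Automated Insights' API.

monthly_sales = {  # the data that would also feed a bar chart
    "Jul": 118_000, "Aug": 131_500, "Sep": 124_250, "Oct": 142_900,
}

def summarize_sales(data: dict) -> str:
    """Turn month-by-month sales figures into a plain-English summary."""
    months = list(data)
    latest, previous = months[-1], months[-2]
    change = (data[latest] - data[previous]) / data[previous] * 100
    best = max(data, key=data.get)
    direction = "rose" if change >= 0 else "fell"
    return (
        f"Sales {direction} {abs(change):.1f}% in {latest} to ${data[latest]:,.0f}. "
        f"{best} was the strongest month in the period."
    )

print(summarize_sales(monthly_sales))
```

Commercial tools generate far richer copy, of course, but the basic pattern is the same: the narrative is computed from the identical data set that drives the visualization, so the chart and the text can never disagree.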

Other popular business intelligence solutions that AI-generated writing toolmakers are creating integrations for include Microsoft Excel, Microsoft Power BI, MicroStrategy, Qlik and Spotfire.

Many of the toolmakers also offer integrations with proprietary business intelligence software.

Integrations of AI-generated writing solutions with popular business intelligence software programs are a lucrative new market for AI toolmakers.

For an in-depth look at this trend, check out, “Company Reports That Write Themselves,” by Joe Dysart.

*The Problem With Responsible AI-Generated Journalism: While many who follow AI-generated writing fret about new fake news generators like GPT-2 and Grover, there are similar risks in ‘responsible’ AI journalism tools, according to Anya Belz.

She’s an associate professor at the University of Brighton who specializes in language, data and artificial intelligence.

The danger with AI-generated news, according to Belz, is that the tools can be arbitrarily selective about content.

And they can also be biased.

Plus, even respected AI-generated news services like Radar and Monok could inadvertently produce copy that is “shot off to any number of destinations at the touch of a button, potentially overwhelming news output, and drowning out more balanced (news) accounts,” Belz observes.

“Journalism needs to develop guidance and best practice to ensure the transparent, ethical and responsible use of AI-generated news if further erosion of concepts like truth, authorship, and responsibility for content is to be avoided,” Belz adds.

*Google’s AI Remake of Search: Another Take: Google’s revamp of its search engine enables the tool to return results based on the often long, sometimes complicated sentences people use when querying by voice.

According to a report in Bloomberg, the search engine facelift produces results that are much more precise.

Overall, the AI upgrade should enable Google to better understand the intent of 10% of Google searches, according to Pandu Nayak, vice president of search at Google.

*AI-Generated Journalism: In Some Cases, More Objective, Study Finds: New research from the University of Miami finds that news and sports stories created with AI journalism tools are considered more credible than the same type of stories written by human journalists.

The study also found that readers saw AI-generated articles on politics as more trustworthy than politics coverage written by humans.

Flesh-and-blood scribes did score a win with financial news stories: Readers found those stories more believable when written by fellow humans.

*What to Look for in AI-Generated Writing Software: Greg Williams, senior director of product marketing at Arria – a maker of AI-generated writing software – offers a checklist for shoppers.

According to Williams, any decent AI-generated writing package should be:

*Open: API-based, easily integrated with disparate data sources and presentation layers

*Extensible: Narratives, conversations, lexicon, all completely driven and controlled by the business

*Smart: Thinks like a human, both linguistically and mathematically

*Secure: Supports a range of deployment models, including cloud, dedicated cloud and on-premises

*AI Writing Gee-Whiz of the Week: Researchers at Johns Hopkins are using the same algorithms driving AI-generated writing to diagnose liver failure.

Specifically, they used AI to analyze emails and other electronic messages written and sent by patients afflicted with liver failure, according to their new paper, published in npj Digital Medicine.

Their findings: Emails and other electronic messages sent by patients who are afflicted by liver failure exhibit telltale signs of altered language use.

Those patients tend to use shorter words in sentences than is typical, the researchers say. And they generally write sentences that are longer than usual.
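For a rough sense of the signals involved, here’s a simplified Python sketch that computes the two language features the researchers describe – average word length and average sentence length. It’s an illustration only, not the Johns Hopkins team’s actual method, and the sample message is made up.

```python
# Illustration only: compute the two coarse language signals the study
# describes (average word length, average sentence length) from a message.
import re

def language_features(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_word_length": sum(len(w) for w in words) / len(words),
        "avg_sentence_length": len(words) / len(sentences),  # words per sentence
    }

sample = ("I feel a bit off today. Can we move my visit up a day "
          "or two if at all possible please.")
print(language_features(sample))
```

In practice, the researchers fed features like these into statistical models trained on many patient messages, rather than eyeballing individual emails.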


*Also on RobotWritersAI.com — Evergreen Article:

*AI-Created Newsletters: On The Cheap


Joe Dysart is editor of RobotWritersAI.com and a tech journalist with 20+ years of experience. His work has appeared in 150+ publications, including The New York Times and the Financial Times of London.
