Teaching robots how to write

Reporters at Aftonbladet have been producing articles for close to 188 years. Now comes one of the biggest revolutions in Schibsted’s history: from now on, robots will write and offer readers hundreds of articles every day. Sweden’s biggest media house is expanding its service to readers with this latest technology.

A text robot will produce content for the traffic and weather coverage, as well as texts for the football coverage in the Sportbladet section. The project is carried out in cooperation with United Robots, a leader in using artificial intelligence (AI) and natural language generation (NLG) to produce publishable articles from large data sources.

The new service will work as a complement to traditional journalistic work and will create content on a scale never reached before at Aftonbladet. Automated writing means there is no practical limit to how many texts can be produced in a day. Furthermore, the automatically written articles give the paper’s reporters more time to concentrate on the biggest and most important stories, while the robot deals with more mundane matters.

The first areas to use the new technology will be traffic and weather reporting, together with football pieces. The traffic robot will produce articles about incidents and warnings from roads all over the country and report the estimated delays that follow. Together with the texts, the robot will send a satellite picture highlighting trouble spots.

Taught by journalists

But how can the robot tell us what is happening on the roads of Sweden?

This is how it works:

  • An incident or a blockage occurs.
  • The state-run Swedish Transport Administration (Trafikverket) sends data to the robot containing facts such as type of incident, number of vehicles involved, GPS coordinates and at what time the road is estimated to be clear again.
  • The robot creates a text using the wording it has been “taught” by Aftonbladet journalists to sound as natural as possible. In order for the texts to be varied, the robot can describe the same incident in several different ways.
  • The text is filed to Aftonbladet and is automatically published on aftonbladet.se. When there is a major incident, a push notification can be sent automatically, as sketched in the example below.
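
To make these steps concrete, here is a minimal Python sketch of such a pipeline. It is an illustration only: the Incident fields, the templates, the three-vehicle threshold for a “major” incident and the publish_to_site and send_push_notification functions are all assumptions, since United Robots’ actual system is not described in detail.

    import random
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Incident:
        """Fields mirroring the kind of data Trafikverket is described as sending."""
        kind: str              # e.g. "collision" or "blockage"
        vehicles: int          # number of vehicles involved
        road: str              # road identifier, e.g. "E4"
        lat: float             # GPS coordinates of the incident
        lon: float
        cleared_by: datetime   # estimated time the road is clear again

    # Several phrasings of the same facts, so published texts vary.
    TEMPLATES = [
        "A {kind} involving {vehicles} vehicles has occurred on the {road}. "
        "The road is expected to be clear by {cleared}.",
        "{vehicles} vehicles are involved in a {kind} on the {road}. "
        "Traffic is expected to flow normally again around {cleared}.",
    ]

    def write_article(incident: Incident) -> str:
        """Fill a randomly chosen template with the incident facts."""
        template = random.choice(TEMPLATES)
        return template.format(
            kind=incident.kind,
            vehicles=incident.vehicles,
            road=incident.road,
            cleared=incident.cleared_by.strftime("%H:%M"),
        )

    def publish_to_site(text: str) -> None:
        print("PUBLISHED:", text)          # stand-in for the real publishing step

    def send_push_notification(text: str) -> None:
        print("PUSH:", text)               # stand-in for the real push service

    def handle_incident(incident: Incident) -> None:
        text = write_article(incident)
        publish_to_site(text)
        if incident.vehicles >= 3:         # assumed threshold for a "major" incident
            send_push_notification(text)

    if __name__ == "__main__":
        handle_incident(Incident("collision", 4, "E4", 59.33, 18.06,
                                 datetime(2018, 11, 5, 17, 30)))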

As for the weather reporting, the service is based on warnings issued by the Swedish Meteorological and Hydrological Institute (SMHI). With the help of the robot service, Aftonbladet will be able to provide more content and wider coverage. That means better opportunities to personalize products and, with geo-positioning, a greater possibility of giving readers relevant information. The robot can also facilitate the work in the newsroom by issuing alerts that draw attention to major events, such as accidents involving several vehicles. Algorithms will analyze all the available data, looking for deviations, connections and events that could warrant wide coverage, and alert the newsroom.
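
As an illustration of how geo-positioning and newsroom alerting could work in principle, the following Python sketch filters simplified warnings by distance from a reader and flags the most severe ones for the newsroom. The Warning class, the 50 km radius and the severity threshold are assumptions made for the example; SMHI’s real warning format and Aftonbladet’s actual logic are not described in the article.

    import math
    from dataclasses import dataclass

    @dataclass
    class Warning:
        """Simplified stand-in for an SMHI weather warning."""
        headline: str
        lat: float
        lon: float
        severity: int  # assumed scale, e.g. 1-3, higher is more severe

    def distance_km(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance, good enough for coarse geo-filtering."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def relevant_warnings(warnings, reader_lat, reader_lon, radius_km=50):
        """Keep only warnings near the reader, most severe first."""
        nearby = [w for w in warnings
                  if distance_km(w.lat, w.lon, reader_lat, reader_lon) <= radius_km]
        return sorted(nearby, key=lambda w: w.severity, reverse=True)

    def newsroom_alerts(warnings, severity_threshold=3):
        """Flag deviations big enough to merit a human reporter's attention."""
        return [w for w in warnings if w.severity >= severity_threshold]

    if __name__ == "__main__":
        warnings = [
            Warning("Heavy snowfall", 59.33, 18.06, severity=2),
            Warning("Storm-force winds", 57.70, 11.97, severity=3),
        ]
        # A reader in central Stockholm only sees the nearby warning.
        print(relevant_warnings(warnings, 59.32, 18.07))
        print(newsroom_alerts(warnings))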

When the whistle blows

Text robots were already in use in the spring of 2018, when Sportbladet increased the depth and breadth of its Premier League reporting. Traditionally, the coverage in Sportbladet has consisted of live reporting from matches together with texts, mainly about the top teams. With the help of the robot, and its global sports data source Sportradar, Sportbladet can now publish articles the moment the referee blows the final whistle – from every game that has been played.
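
As a rough illustration of the idea, and not Sportbladet’s or United Robots’ actual code, this Python sketch turns simplified, Sportradar-style match facts into a short report at the final whistle. The data classes and the sample teams and scorers are invented for the example.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Goal:
        minute: int
        scorer: str
        team: str

    @dataclass
    class Match:
        """Simplified stand-in for the structured match data a sports feed provides."""
        home: str
        away: str
        home_goals: int
        away_goals: int
        goals: List[Goal] = field(default_factory=list)

    def final_whistle_report(match: Match) -> str:
        """Build a short report the moment the match data is marked as finished."""
        if match.home_goals > match.away_goals:
            headline = f"{match.home} beat {match.away} {match.home_goals}-{match.away_goals}"
        elif match.home_goals < match.away_goals:
            headline = f"{match.away} win {match.away_goals}-{match.home_goals} away at {match.home}"
        else:
            headline = f"{match.home} and {match.away} draw {match.home_goals}-{match.away_goals}"

        scorers = ", ".join(f"{g.scorer} ({g.minute}')" for g in match.goals) or "No goals"
        return f"{headline}. Scorers: {scorers}."

    if __name__ == "__main__":
        m = Match("Arsenal", "Chelsea", 2, 1, [
            Goal(12, "Aubameyang", "Arsenal"),   # sample data, not a real match
            Goal(38, "Hazard", "Chelsea"),
            Goal(77, "Lacazette", "Arsenal"),
        ])
        print(final_whistle_report(m))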

Together with TV highlights from Viasat and match facts from Sportbladet’s goal service, this gives readers the chance to read about their team’s match instantly. “With all the deep data we have from major leagues such as the Premier League, the robot knows a lot about what has happened during the match”, says Sports Editor Pontus Carlgren. “That, together with the speed a robot offers, gives our coverage a new dimension.”

“Thanks to the possibility of subscribing to texts through the follow function on the site, supporters won’t miss a single article about their team”, Carlgren says, adding: “The fact that the robot produces texts that are finished at the final blow of the whistle will give our knowledgeable reporters and columnists more time to work on thought-through articles and columns for the benefit of our readers.”


How to make friends with robots

Imagine. It’s 2025, and you’re slowly waking up. The robot next to your bed is beaming artificial sunlight in sync with the natural sunlight outside your curtains. It’s projecting today’s most important headlines on the ceiling above your bed and as your eyes flick through the news, you hear the shower turning on, coffee dripping and the stove working its magic.

A perfectly suited outfit is laid out on your bed. Your robot hands you coffee and a breakfast tailored to your taste. A driverless car swings up, ready to take you to work. But perhaps you skip the ride and instead put your VR headset on and lean back as a 3D digital office unfolds in front of your eyes. If you’re a sci-fi geek, none of these scenarios will seem particularly far-out. But how likely are we to experience a future in which they no longer belong to fiction, but are as natural a part of our day as smartphones? And is this a future we want?

“What kind of future do we want?”

Japan has always been a frontrunner in technology. Erica, a semi-autonomous android, resembles a human: a plastic skull and silicone skin, with wires connecting her to the AI software systems that bring her to life. She is the result of one of the best-funded scientific projects in Japan, a collaboration between Osaka and Kyoto universities and the Advanced Telecommunications Research Institute International. Her creators, Hiroshi Ishiguro and Dylan Glas, claim Erica is “living” proof that the future might be closer than we think. Using sensors and microphones, she can pick up what people say to her, and through AI software she can respond convincingly. Erica works as a receptionist at their laboratory.

It makes sense to emulate humans

The tech industry knows that humans are inherently wired to recognize and interact with other humans. So from a behavioral-adoption point of view, it makes sense to develop robots that emulate the human body. These robots rely on our adoption to fuel their machine-learning minds and make them smarter, or more human, and the only way to achieve this is to stimulate frequent, continuous and human-like interaction over time. Research also shows that we tend to find robots endearing as long as they resemble something non-human, like animals. The moment we encounter robots that resemble humans, we find them troubling.

That said, we are capable of developing feelings for robots. Multiple studies clearly imply that, given the right context, we subconsciously start treating the machines around us like social beings. Several studies have shown participants feeling grief and remorse when asked either to abuse a robot or to witness one being abused. In one study, participants found it nearly impossible to push a button that would destroy a friendly-looking robot when it misbehaved, while it begged to be forgiven. Technology has consistently functioned as a catalyst for a more efficient, healthy and sustainable world. In the healthcare sector, for example, we are seeing huge advances in life-saving techniques that leverage AI to aid doctors with disease diagnosis.

It’s up to us

Robots, created in both hardware and software, will inevitably shape our future. How we let them do that is up to us. Robots of the kind embodied by Erica will not be ubiquitous in the next few years. To give a picture of the complexity involved in designing the interplay between the software and hardware that keeps a bot on its feet: it typically takes more than 500,000 lines of code to put one foot in front of the other. Then again, this is an ability that took hundreds of thousands of years of evolution for humans to perfect.

Perhaps our concerns about humanoid robots will never materialize. Considering the pace at which humans are integrating more closely with technology, through enhancements such as brain-computer interfaces (BCIs), exoskeletons and, eventually, nanobots streaming through our bodies, maybe the question should not be whether robots should resemble humans. Perhaps the integration of human and technology will redefine what it means to be human. Together as consumers we have more power than robotics scientists. Technology has to adapt to us, because we can’t adapt to it fast enough. At this point in time, we should be asking ourselves what kind of future we want. And the most important question remains to be answered: which ethical standards, values and goals do we choose to pass on to our mechanical companions?