It is no secret that Wikipedia relies on bots to cope with the high volume of edits it receives every day. Without machine help, the largest online encyclopedia might not be able to keep its content under control. However, since these bots rely on artificial intelligence, developers have not been able to track their evolution very closely. A recent paper did study their behavior, and it found that these bots have been waging edit wars with one another for years.
A new study published in PLOS ONE discovered something odd. Wikipedia has employed bots for years to help maintain the massive content of its online encyclopedia. Their job is to mitigate vandalism, apply bans, check grammar, import content automatically, and carry out many other editing tasks. A second type of bot, on the other hand, was created for non-editing purposes such as identifying copyright infringements and mining data.
Thus, thousands of bots each have their own set of tasks to carry out. Once in a while, however, these tasks overlap: the specific task one bot performs can contradict the purpose of another bot. When such an edit war breaks out, the two start undoing and redoing each other's edits indefinitely, because they have no protocol through which to communicate with one another.
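The revert loop described above is easy to reproduce in miniature. The sketch below is purely illustrative (the function name and the spelling example are hypothetical, not from the study): two rule-based bots each enforce their own preferred text, and with no way to coordinate, the page never reaches a stable state.

```python
# Minimal sketch of an uncoordinated bot edit war: each bot blindly
# rewrites the page to its preferred version whenever it differs.

def run_edit_war(page, bots, max_rounds=10):
    """Each round, every bot reverts the page to its preferred text
    if the current text differs. Returns the revision history."""
    history = [page]
    for _ in range(max_rounds):
        changed = False
        for preferred in bots:
            if page != preferred:
                page = preferred       # this bot "reverts" the other's edit
                history.append(page)
                changed = True
        if not changed:                # bots agree: the war would end
            break
    return history

# Two hypothetical bots enforcing different spellings of the same word:
history = run_edit_war("colour", ["color", "colour"], max_rounds=3)
print(len(history))  # → 7: the page flip-flops twice per round, never settling
```

Because neither bot models the other's intent, the loop only terminates when an external limit (here `max_rounds`) cuts it off, which mirrors the study's finding that bot-on-bot reverts can persist for years.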
To understand how Wikipedia bots have evolved over time, a team of computer scientists from the Oxford Internet Institute and the Alan Turing Institute selected the same pages across 13 different language editions. Within this content, the researchers focused on how the machines made their edits over a ten-year period starting in 2001, making sure that no human editor had changed anything on the selected pages.
This is how the scientists tracked the interactions between bots. One odd observation was that the bots behaved differently in different cultural environments. The Portuguese pages experienced the most edit warring, with an average of 185 reverts; the English versions averaged 105; and the German editions were the calmest, with an average of only 24 reverts.
Taha Yasseri, a co-author of the paper, stated that the same technology can clearly evolve in different ways depending on its cultural environment. This is a warning to be more careful when creating bots: because these automated agents carry a certain background, they may engage in perpetual edit wars or other kinds of conflict once they encounter each other. Thus, he argues, the future of robotics will require keeping track of all such software across the world.