There’s so much to be seen here that -- until somewhat recently -- was fairly unheard of. And we don’t know what’s good or bad. It’s as if we’re constantly coming across a new cast of characters and are forced to ask, “Are you a good witch, or a bad witch?”
Replace the word "witch" with "bot," and you might be summing up the modern digital landscape. There's a lot of talk about AI, but it can be confusing. Is it helpful, or harmful? Is it going to make us better at our jobs, or take them away from us? And these bots of which we're constantly speaking -- which are good, and which are bad?
As it turns out, there are ways of distinguishing them. It requires a bit of a discerning eye, but you certainly don't need to be an expert -- you just need the right information. So, without further ado, allow us to present our tips for distinguishing good bots from bad bots.
Copyright bots search the web for content that's potentially been plagiarized. Think: illegal uploads, someone else's work copied without proper attribution, or other improper use of proprietary content. According to the Electronic Frontier Foundation, these bots are commonly used within the realm of social media, especially where original content creation is a major part of the platform's use. One prime example is YouTube's Content ID, which is assigned to copyright owners on the network.
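To make the idea concrete, here's a minimal sketch of one generic way a copyright bot might flag copied text: break documents into overlapping word "shingles" and compare the overlap. This is an illustrative technique only -- Content ID itself works on audio and video fingerprints, not this exact method.

```python
# Illustrative sketch of near-duplicate text detection, one generic
# technique a copyright bot might use. This is NOT how Content ID
# actually works; it is an assumption-laden toy example.

def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word 'shingles' for fuzzy matching."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity between two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original  = "the quick brown fox jumps over the lazy dog near the river"
copied    = "the quick brown fox jumps over the lazy dog near the water"
unrelated = "completely different sentence about weather forecasts today"

print(similarity(original, copied) > 0.5)     # True: heavy overlap, likely a copy
print(similarity(original, unrelated) > 0.5)  # False: little overlap, original work
```

A real system would compare fingerprints against millions of works at scale, but the core question is the same: how much of this content already exists somewhere else?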
According to eZanga, data bots are those that provide up-to-the-minute information on things like news, weather, and currency rates. By those criteria, tools like Amazon Echo, Google Home, and Siri could be classified as data bots -- especially since eZanga also calls these "media" bots. However, one technology developer, Botler, classifies one of its products as a data bot -- "a new way to quickly store and access info that is important." Its primary use, it appears, is in the academic sector, as it allows course information to be easily shared between students and faculty.
Think about what a spider does -- it crawls. Search engines do the same thing, using spider bots to crawl the web's content and produce query results. Google, for example, has its very own Googlebot, which uses the constantly evolving Google algorithm to determine which sites to crawl.
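The crawl itself boils down to a simple loop: visit a page, note what's on it, follow its links, and never visit the same page twice. Here's a minimal sketch of that loop. To keep it self-contained, pages live in an in-memory dictionary (the page names are made up); a real spider bot would fetch them over HTTP and respect robots.txt.

```python
from collections import deque

# Minimal sketch of a spider bot's crawl loop. Pages are simulated as a
# dict mapping each URL to the URLs it links to -- a stand-in for
# fetching real pages over the network.
site = {
    "/":            ["/about", "/blog"],
    "/about":       ["/"],
    "/blog":        ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/"],
    "/blog/post-2": ["/blog/post-1"],
}

def crawl(start: str) -> list:
    """Breadth-first crawl: visit every reachable page exactly once."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)                # "index" the page
        for link in site.get(url, []):   # follow its outgoing links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))
# → ['/', '/about', '/blog', '/blog/post-1', '/blog/post-2']
```

The `seen` set is what keeps a crawler from going in circles when pages link back to each other -- which, on the real web, they constantly do.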
These days, spider bots aren’t limited to search engines. The Siemens Robotics Lab, for example, has developed spider-shaped robots that combine the ability to autonomously perform physical tasks with information-crawling capabilities. How does that work, exactly? Siemens Research Scientist Hasan Sinan Bank explains:
"The robots use onboard cameras as well as a laser scanner to interpret their immediate environment. Knowing the range of its 3D-printer arm, each robot autonomously works out which part of an area -- regardless of whether the area is flat or curved -- it can cover, while other robots use the same technique to cover adjacent areas."