In February last year, San Francisco’s art lovers were treated to a new kind of exhibition. Titled “DeepDream: The Art of Neural Networks” and held in the trendy Mission District, the art on display was otherworldly, strange, and psychedelic. You can read more about it here. The event was a joint effort between Google and the Gray Area Foundation, an organization dedicated to fostering collaboration between the arts and technology.

If you’re wondering what Google had to do with this, it’s because the artworks on display were a product of Google’s revolutionary DeepDream software. First developed in Google’s Zurich office in 2014, DeepDream uses neural networks, complex mathematical algorithms that allow machines to learn specific behavior by analyzing reams of data in search of patterns. These neural networks study millions of images and generate unique artwork by accentuating shapes and objects that aren’t actually there but that they have learnt to recognize through studying patterns (a rough code sketch of this idea appears at the end of this piece). Artworks created with DeepDream raised almost $100,000 for the foundation at the exhibition.

Recently, The Economic Times carried an article about a survey conducted by Adobe, the company famous for its image-editing software Photoshop. Adobe surveyed over 5,000 professionals across the Asia Pacific region and found that 27% of Indian respondents were “extremely concerned about the impact of these new technologies.” Those surveyed feared that artificial intelligence (AI) would take over their jobs, a concern apparently not shared by the rest of Asia, over 50% of whom were unconcerned by AI or machine learning.

India’s fears in this respect are probably unfounded, however. According to Kulmeet Bawa, Managing Director, Adobe South Asia, "While AI and machine learning provide an opportunity to automate processes and save creative professionals from day-to-day production, it is not a replacement to the role of creativity."

AI and Music

The same can be said of AI’s impact on music. Another of Google’s efforts is Project Magenta, the brainchild of Douglas Eck, a Google employee and amateur musician. According to Eck, the idea is not to replace artists with technology, but to create tools that enable artists to unleash their creativity in new and unexpected ways. To that end, an offshoot of Project Magenta, Google’s “NSynth” team, has been focused on creating new musical instruments. By feeding the unique sonic characteristics of hundreds of instruments into a neural network, they are able to combine these instruments to create entirely new ones.

Both DeepDream and NSynth are designed to be tools that extend the possibilities for artists. As Adobe’s Kulmeet Bawa told The Economic Times, “It provides more levy for creatives to spend their time focusing on what they do best -- being creative, scaling their ideas and allowing them time to focus on ideation and creativity." In other words, while AI can never replace Anjolie Ela Menon or A R Rahman, it might just replace the brush and the guitar.
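
For readers curious about what is happening under the hood, here is a minimal sketch of the DeepDream idea described above: take an image, pick a layer of a pretrained image-recognition network, and repeatedly nudge the image so that the layer’s activations grow stronger, so the patterns the network has learnt begin to surface in the picture. This is an illustrative assumption of the technique, not Google’s actual DeepDream code; the model (torchvision’s Inception v3), the layer (Mixed_5b), the step size, and the file names are all placeholder choices.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# A pretrained image classifier; any convolutional network would do.
weights = models.Inception_V3_Weights.DEFAULT
model = models.inception_v3(weights=weights).to(device).eval()

# Capture the activations of one intermediate layer with a forward hook.
activations = {}
def save_activation(_module, _inputs, output):
    activations["feat"] = output
model.Mixed_5b.register_forward_hook(save_activation)  # layer choice is arbitrary

# Load and preprocess a starting image (file name is a placeholder).
preprocess = T.Compose([T.Resize(299), T.CenterCrop(299), T.ToTensor()])
img = preprocess(Image.open("input.jpg").convert("RGB")).unsqueeze(0).to(device)
img.requires_grad_(True)

# Gradient ascent: change the image so the chosen layer "fires" more strongly,
# amplifying whatever shapes and textures that layer has learnt to detect.
for _ in range(20):
    model(img)
    loss = activations["feat"].norm()
    loss.backward()
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        img.clamp_(0, 1)

# Save the "dreamed" image.
T.ToPILImage()(img.detach().squeeze(0).cpu()).save("dream.jpg")
```

Choosing an earlier layer tends to exaggerate textures and edges, while a deeper layer brings out more object-like shapes, which is what gives DeepDream images their psychedelic, hallucinated quality.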