Some Headlines that Caught My Eye

Look around, look around at how lucky we are to be alive right now… —Lin-Manuel Miranda, Hamilton

Every once in a while, I write a post like this one to highlight some of the amazing things happening in the world of electronics and technology. Today I culled through some recent headlines that set my mind spinning. Think about how they might affect how Cadence products are used!

Electronic Stickers Monitor Health

Researchers at Purdue University have developed extremely inexpensive wearable electronic devices that can monitor health metrics in real time. The devices are made of cellulose (paper), adhere to the skin, and are thin and stretchable, making them nearly imperceptible to the wearer. "For the first time, we have created wearable electronic devices that someone can easily attach to their skin and are made out of paper to lower the cost of personalized medicine," said Ramses Martinez, a Purdue assistant professor of industrial engineering and biomedical engineering, who led the research team. The devices can also be implanted in a patient, because they conform to internal organs without causing adverse reactions. The stickers are coated with molecules that repel water, oil, dust, and bacteria. Each sticker costs about a nickel to produce and can be made using printing and manufacturing technologies like those used to print books at high speed.

The possibilities for this technology are extensive: monitoring your heart after a cardiac event, monitoring patients' sleep, even letting athletes track their health while exercising, including while swimming. Can you imagine a future where the first thing that happens when you go to an emergency room is getting a sticker slapped on your wrist to monitor your vitals, replacing the bulky, expensive, and intrusive monitors you would otherwise be hooked up to? The implications for triage alone are significant. How about attaching a sticker after major surgery, to notify you and your doctor in real time if your vitals change, so you can go home sooner?

Purdue is looking for partners to test and commercialize the technology. At five cents a pop, it seems like a no-brainer to develop it further.

Sources:
https://pubs.acs.org/doi/10.1021/acsami.8b11020
https://www.sciencedaily.com/releases/2018/10/181016132009.htm

Processing Abstract Thought as Revealed by Artificial Intelligence

This one is a bit more… well… abstract. Cameron Buckner, assistant professor of philosophy at the University of Houston, has published a paper titled "Empiricism without magic: transformational abstraction in deep convolutional neural networks" in the journal Synthese. What does this mean? The paper posits that deep convolutional neural networks, or DCNNs, support the idea that human knowledge stems from experience, a school of thought known as empiricism. The success of these networks at complex tasks involving perception and discrimination has at times outpaced scientists' ability to understand how they work. As Science Daily reports:

While some scientists who build neural network systems have referenced the thinking of British philosopher John Locke and other influential theorists, their focus has been on results rather than understanding how the networks intersect with traditional philosophical accounts of human cognition.
Buckner set out to fill that void, considering the use of AI for abstract reasoning, ranging from strategy games to the visual recognition of chairs, artwork, and animals, tasks that are surprisingly complex considering the many potential variations in vantage point, color, style, and other details.

DCNNs have also answered another lingering question about abstract reasoning, Buckner says. Empiricists from Aristotle to Locke have appealed to a faculty of abstraction to complete their explanations of how the mind works, but until now there hasn't been a good explanation of how that faculty operates. "For the first time, DCNNs help us to understand how this faculty actually works," Buckner says. Less than a decade ago, he says, scientists believed advances in machine learning would stop short of the ability to produce abstract knowledge. Now that machines are beating humans at strategic games, driverless cars are being tested around the world, and facial recognition systems are deployed everywhere from cell phones to airports, finding answers has become more urgent. "These systems succeed where others failed," he says, "because they can acquire the kind of subtle, abstract, intuitive knowledge of the world that comes automatically to humans but has until now proven impossible to program into computers."

The issue I have with this line of reasoning is that there are different ways of "knowing", some of which are profoundly human. For example, I can "know" a truth as it speaks to me through a piece of art, which is separate from empiricism. Looking at the mural Guernica tells of the horrors of war in a way that I can understand on some level, even though I didn't experience the Spanish Civil War directly, and neither did Pablo Picasso, who was living in Paris when the town of Guernica was bombed. His "empirical" experience of the aftermath was based on news reports. And yet the painting reveals a non-empirical truth.

Guernica, by Pablo Picasso, 1937

(Interesting fact: While Picasso was living in Nazi-occupied Paris during World War II, a German officer, upon seeing a photo of Guernica in his apartment, allegedly asked him, "Did you do that?" Picasso responded, "No, you did.")

Now we're getting into realms of philosophy that are beyond my ken, but I'm interested in the overlap between AI and epistemology. When does a DCNN "know" something? When does a human? How do either of us "know"? It seems that this paper is attempting to answer that question.

Sources:
https://link.springer.com/article/10.1007%2Fs11229-018-01949-1
https://www.sciencedaily.com/releases/2018/10/181009115022.htm
https://www.theguardian.com/books/2005/jan/08/highereducation.news

Other Articles that Caught My Eye

So much is going on in the world that it's hard to narrow things down. Here's a smattering of other articles that caught my eye of late:

Computer Stories: A.I. Is Beginning to Assist Novelists
How Autonomous Vehicles Will Reshape Our World
How Driverless Cars Are Going to Change Cities (check out the neat graphic about halfway through)
Researchers quickly harvest 2-D materials, bringing them closer to commercialization
Robots at Work and Play (a photo essay)
Ultra-light gloves let users 'touch' virtual objects

—Meera
