When an AI Tries Writing Slashdot Headlines (159)

For Slashdot's 20th anniversary, "What could be geekier than celebrating with the help of an open-source neural network?" Neural network hobbyist Janelle Shane has already used machine learning to generate names for paint colors, guinea pigs, heavy metal bands, and even craft beers, she explains on her blog. "Slashdot sent me a list of all the headlines they've ever run, over 162,000 in all, and asked me to train a neural network to try to generate more." Could she distill 20 years of news -- all of humanity's greatest technological advancements -- down to a few quintessential words?

She trained it separately on the first decade of Slashdot headlines -- 1997 through 2007 -- as well as the second decade from 2008 to the present, and then re-ran the entire experiment using the whole collection of every headline from the last 20 years. Among the remarkable machine-generated headlines?
  • Microsoft To Develop Programming Law
  • More Pong Users for Kernel Project
  • New Company Revises Super-Things For Problems
  • Steve Jobs To Be Good
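Shane's actual setup used an open-source character-level recurrent neural network trained on the headline corpus. As a much simpler illustration of the same character-by-character idea, here is a toy Markov-chain text generator -- a sketch, not her method, and every function name here is invented for illustration:

```python
import random

def build_model(headlines, order=2):
    """Map each `order`-character context to the characters observed after it."""
    model = {}
    for line in headlines:
        padded = "^" * order + line + "$"   # ^ marks the start, $ the end
        for i in range(len(padded) - order):
            ctx, nxt = padded[i:i + order], padded[i + order]
            model.setdefault(ctx, []).append(nxt)
    return model

def generate(model, order=2):
    """Walk the model from the start marker until the end marker appears."""
    ctx, out = "^" * order, []
    while True:
        nxt = random.choice(model[ctx])
        if nxt == "$":
            return "".join(out)
        out.append(nxt)
        ctx = ctx[1:] + nxt
```

Fed 162,000 real headlines, even this toy produces headline-shaped nonsense; the neural network's advantage is learning far longer-range structure than a two-character context can capture.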

But that was just the beginning...


'Maybe Wikipedia Readers Shouldn't Need Science Degrees To Digest Articles About Basic Topics' (304)

Wikipedia articles about "hard science" (physics, biology, chemistry) topics are really mostly written for other scientists, writes Michael Byrne, a reporter on the science beat at Vice's Motherboard news outlet. From the article: This particular class of Wikipedia article tends to take the high-level form of a scientific paper. There's a brief intro (an abstract) that is kinda-sorta comprehensible, but then the article immediately degenerates into jargon and equations. Take, for example, the page for the electroweak interaction in particle physics. This is a topic of potentially broad interest; its formulation won a trio of physicists the Nobel Prize in 1979. Generally, it has to do with a fundamental linkage between two of the four fundamental forces of the universe, electromagnetism and the weak force. The Wikipedia article for the electroweak force consists of a two-paragraph introduction that basically just says what I said above plus some fairly intimidating technical context. The rest of the article is almost entirely gnarly math equations. I have no idea who the article exists for because I'm not sure that person actually exists: someone with enough knowledge to comprehend dense physics formulations who doesn't also already understand the electroweak interaction, or who doesn't already have, like, access to a textbook about it. For another, somewhat different example, look at the article for graphene. Graphene is, of course, an endlessly hyped superstrong supermaterial. It's in the news constantly. The article isn't just a bunch of math equations, but it's also not much more penetrable for a reader without at least some chemistry/materials science background.

Why Is There No Nobel Prize In Technology? (148)

An anonymous reader quotes a report from Quartz: As the world focuses its attention on this year's recipients of the planet's most prestigious prize, the Nobel, it feels like something's missing from the list: technology. Swedish inventor Alfred Nobel established the prizes more than a century ago with the instruction that his entire estate be used to endow "prizes to those who, during the preceding year, shall have conferred the greatest benefit to mankind." The categories laid out in his will -- physics, chemistry, physiology or medicine, literature, and peace -- have remained the basis of the awards, and a prize for economics was added in 1968. So, what gives? Why only those five original fields? Nobel didn't say, revealing only that he made his choices "after mature deliberation."

One way of looking at it is that when he was designing his categories, he wanted the prizes to only reflect advances in fundamental science. In this view, "lesser" sciences such as biology, geology, or computer science -- or technology-driven fields such as engineering or robotics -- don't qualify. As genome-sequencing pioneer Eric Lander once said, "You don't get a Nobel Prize for turning a crank." But what then of literature and peace, or the newer prize for economics (an applied science at best, and a pseudoscience at worst)? Technology isn't the only field to get the cold shoulder. Mathematics -- the international language, the foundation of so many scientific pursuits, and arguably the most fundamental theoretical discipline of all -- doesn't have a Nobel Prize, either. Mathematicians have complained about this for decades. One story suggests that Nobel disliked the Swedish mathematician Gösta Mittag-Leffler, and assumed that he would be the first winner of the mathematics prize, if he decided to award one. Alternatively, math undergraduates are often told that Nobel was jealous of a Swedish mathematician who had an affair with his wife (though this story is ruined by the fact that Nobel didn't actually have a wife).


One of the World's Most Influential Math Texts Is Getting a Beautiful, Minimalist Edition (81)

An anonymous reader shares a report: A couple of years ago, a small publisher called Kronecker Wallis issued a handsome, minimalist take on Isaac Newton's Principia. Now, the publisher is embarking on its next project: Euclid's Elements. The publisher is using Kickstarter to fund this new edition. Euclid's Elements is a mathematical text written by the Greek mathematician Euclid around 300 BCE, and it has been called one of the most influential textbooks ever produced. The treatise contains 13 separate books, covering everything from plane geometry and the Pythagorean theorem to the golden ratio, prime numbers, and quite a bit more. The books helped to influence scientists such as Nicolaus Copernicus, Johannes Kepler, Galileo Galilei, and Sir Isaac Newton. In 1847, an English mathematician named Oliver Byrne re-wrote the first six books of Euclid's Elements, taking its concepts and illustrating them.

Mathematicians Race To Debunk German Man Who Claimed To Solve The 'P Versus NP' Problem (156)

A German man -- Norbert Blum -- who claimed that P is not equal to NP is seeing several challenges to his solution. From a report: Numerous mathematicians have begun to raise questions about whether the German mathematician solved it at all. Since Blum's paper was published, mathematicians and computer scientists worldwide have been racking their brains as to whether the Bonn-based researcher has, in fact, solved this Millennium Prize Problem. After an initially positive reaction, such as the one from Stanford mathematician Reza Zadeh, doubts are beginning to arise about whether Blum's reasoning is correct. In a forum for theoretical mathematics, a user named Mikhail reached out to Alexander Razborov -- the author of the paper on which Blum's proof is based -- to ask him about Blum's paper. Razborov says he has discovered an error in Blum's paper: Blum's main argument contradicts one of Razborov's key assumptions. And mathematician Scott Aaronson, who is something of an authority in the math community when it comes to P vs. NP, said he would be willing to bet $200,000 that Blum's mathematical proof won't endure. "Please stop asking," Aaronson writes. If the proof hasn't been refuted, "you can come back and tell me I was a closed-minded fool." In the week since Aaronson's initial blog post, other mathematicians have begun trying to poke holes in Blum's proof. Dick Lipton, a computer science professor at Georgia Tech, wrote in a blog post that Blum's proof "passes many filters of seriousness," but suggested there may be some problems with it. A commenter on that blog post, known only as "vloodin," noted that there was a "single error on a subtle point" in the proof; other mathematicians have since chimed in and confirmed vloodin's initial analysis, and so the emerging consensus among many mathematicians is that a solution to P vs. NP remains elusive.
United States

The IRS Decides Who To Audit By Data Mining Social Media (232)

In America the Internal Revenue Service used to pick who got audited based on math mistakes or discrepancies with W-2 forms -- but not any more. schwit1 shares an article from the Vanderbilt Journal of Entertainment and Technology Law describing their new technique: The IRS is now engaging in data mining of public and commercial data pools (including social media) and creating highly detailed profiles of taxpayers upon which to run data analytics. This article argues that current IRS practices, mostly unknown to the general public, are violating fair information practices. This lack of transparency and accountability not only violates federal law regarding the government's data collection activities and use of predictive algorithms, but may also result in discrimination. While the potential efficiencies that big data analytics provides may appear to be a panacea for the IRS's budget woes, unchecked these activities are a significant threat to privacy [PDF]. Other concerns regarding the IRS's entrance into big data are raised including the potential for political targeting, data breaches, and the misuse of such information.
While tax evasion cost the U.S. $3 trillion between 2000 and 2009, one of the report's authors argues that people should be aware "that what they say and do online" could be used against them.
The Internet

Cord-Cutting Still Doesn't Beat the Cable Bundle (421)

I'd like to cut the cord, writes Brian Barrett for Wired, but the very instant I allow myself to picture what life looks like after that figurative snip, my reverie comes crashing down. From the article: Cutting the cord is absolutely right for some people. Lots of people, maybe. But it's not that cheap, and it's not that easy, and there's not much hope of improvement on either front any time soon. Not to turn this into a math experiment, but let's consider cost. Assuming you're looking for a cord replacement, not abandoning live television altogether, you're going to need a service that bundles together a handful of channels and blips them to your house over the internet. The cheapest way you can accomplish this is to pay Sling TV $20 per month, for which you get 29 channels. That sounds not so bad, and certainly less than your cable bill. But! Sling Orange limits you to a single stream. If you're in a household with others, you'll probably want Sling Blue, which offers multiple streams and 43 channels for $25 per month. But! Sling Orange and Sling Blue have different channel lineups (ESPN is on Orange, not Blue, while Orange lacks FX, Bravo and any locals). For full coverage, you can subscribe to both for $40. But! Have kids? You'll want the Kids Extra package for another $5 per month. Love ESPNU? Grab that $5 per month sports package. HBO? $15 per month, please. Presto, you're up to $65 per month. But! Don't forget the extra $5 for a cloud-based DVR. Plus the high-speed internet service that you need to keep your stream from buffering, which, by the way, it'll do anyway. That's not to pick on Sling TV, specifically. But paying $70 to quit cable feels like smoking a pack of Parliaments to quit Marlboro Lights. You run into similar situations across the board, whether it's a higher base rate, or a limited premium selection, or the absence of local programming altogether.
It turns out, oddly enough, that things cost money, whether you access those things through traditional cable packages or through a modem provided to you by a traditional cable operator.
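Barrett's tally is straightforward addition; a quick sketch of the line items he quotes (2017 Sling TV pricing as cited in the article, not current rates):

```python
# Monthly line items quoted in the article
monthly = {
    "Sling Orange + Blue": 40,   # both lineups, for full channel coverage
    "Kids Extra": 5,
    "Sports Extra (ESPNU)": 5,
    "HBO": 15,
}
subtotal = sum(monthly.values())   # the article's "$65 per month"
total = subtotal + 5               # plus $5 for the cloud-based DVR
print(subtotal, total)             # 65 70
```

High-speed internet service comes on top of that, which is the article's point.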

After 15 Years, Maine's Laptops-in-Schools Initiative Fails To Raise Test Scores (158)

For years Maine has been offering laptops to high school students -- but is it doing more harm than good? An anonymous reader writes: One high school student says "We hardly ever use paper," while another student "says he couldn't imagine social studies class without his laptop and Internet connection. 'I don't think I could do it, honestly... I don't want to look at a newspaper. I don't even know where to get a newspaper!'" But then the reporter visits a political science teacher who "learned what a lot of teachers, researchers and policymakers in Maine have come to realize over the past 15 years: You can't just put a computer in a kid's hand and expect it to change learning."

"Research has shown that 'one-to-one' programs, meaning one student one computer, implemented the right way, increase student learning in subjects like writing, math and science. Those results have prompted other states, like Utah and Nevada, to look at implementing their own one-to-one programs in recent years. Yet, after a decade and a half, and at a cost of about $12 million annually (around one percent of the state's education budget), Maine has yet to see any measurable increases on statewide standardized test scores."

The article notes that Maine de-emphasized teacher training which could've produced better results. One education policy researcher "says this has created a new kind of divide in Maine. Students in larger schools, with more resources, have learned how to use their laptops in more creative ways. But in Maine's higher poverty and more rural schools, many students are still just using programs like PowerPoint and Microsoft Word."

Microsoft Avoids Washington State Taxes, Gives Nevada Schoolkid A Surface Laptop (72)

theodp writes: The Official Microsoft Blog hopes a letter from a Nevada middle schooler advising Microsoft President Brad Smith to "keep up the good work running that company" will "inspire you like it did us." Penned as part of a math teacher's assignment to write letters to the businesses that they like, Microsoft says the letter prompted Smith to visit the Nevada school to meet 7th-grader Sky Yi in person as part of the company's effort to draw attention to the importance of math and encourage students and teachers who are passionate about STEM (science, technology, engineering and math) education. In an accompanying video of the surprise meeting, Smith presents Yi with a new Surface Laptop that comes with Windows 10 S, a version of the OS that has been streamlined with schools in mind. "Not bad for a little letter," the Microsoft exec says.

Speaking of Microsoft, Nevada, and education, Bing Maps coincidentally shows the school Smith visited is just a 43-minute drive from the software giant's Reno-based Americas Operations Center. According to the Seattle Times, routing sales through the Reno software-licensing office helps Microsoft minimize its tax bills (NV doesn't tax business income) to the detriment, some say, of Washington State public schools.

Microsoft's state and local taxes dropped to just $30 million last year (from an average of $214 million over the previous 14 years), according to the Seattle Times. "A Microsoft spokesman said the decline in 2017 was caused by the company's deferring taxes on some income to future years and the winding down of the company's smartphone business."

MIT Team's School-Bus Algorithm Could Save $5M and 1M Bus Miles (104)

An anonymous reader shares a report: A trio of MIT researchers recently tackled a tricky vehicle-routing problem when they set out to improve the efficiency of the Boston Public Schools bus system. Last year, more than 30,000 students rode 650 buses to 230 schools at a cost of $120 million. In hopes of spending less this year, the school system offered $15,000 in prize money in a contest that challenged competitors to reduce the number of buses. The winners -- Dimitris Bertsimas, co-director of MIT's Operations Research Center, and doctoral students Arthur Delarue and Sebastien Martin -- devised an algorithm that drops as many as 75 bus routes. The school system says the plan, which will eliminate some bus-driver jobs, could save up to $5 million, 20,000 pounds of carbon emissions and 1 million bus miles (Editor's note: the link could be paywalled; alternative source). The computerized algorithm runs in about 30 minutes and replaces a manual system that in the past has taken transportation staff several weeks to complete. "They have been doing it manually many years," Dr. Bertsimas said. "Our whole running time is in minutes. If things change, we can re-optimize." The task of plotting school-bus routes resembles the classic math exercise known as the Traveling Salesman Problem, where the goal is to find the shortest path through a series of cities, visiting each only once, before returning home.
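The winning MIT algorithm is far more sophisticated than anything that fits here, but the flavor of the underlying routing problem can be sketched with the classic nearest-neighbor heuristic for a traveling-salesman-style tour -- a toy illustration, not the researchers' method:

```python
import math

def nearest_neighbor_route(stops, start=0):
    """Greedy tour: from each stop, go to the closest not-yet-visited stop.
    Fast and simple, but can be far from optimal -- which is why real
    routing systems refine such tours with further optimization passes."""
    unvisited = set(range(len(stops))) - {start}
    route = [start]
    while unvisited:
        here = stops[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, stops[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Four stops along a straight road: the greedy tour visits them in order
print(nearest_neighbor_route([(0, 0), (1, 0), (2, 0), (5, 0)]))  # [0, 1, 2, 3]
```

Real school-bus routing layers constraints on top of this (bell times, bus capacities, maximum ride lengths), which is what makes the optimized version worth prize money.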
The Almighty Buck

'World of Warcraft' Game Currency Now Worth More Than Venezuelan Money (189)

schwit1 quotes TheBlaze: Digital gold from Blizzard's massive multiplayer online game "World of Warcraft" is worth more than actual Venezuelan currency, the bolivar, according to new data. Venezuelan resident and Twitter user @KalebPrime first made the discovery July 14 and tweeted at the time that on Venezuela's black market -- now the most-used method of currency exchange within Venezuela according to NPR -- you can get $1 for 8493.97 bolivars. Meanwhile, a "WoW" token, which can be bought for $20 from the in-game auction house, is worth 8385 gold per dollar. According to sites that track the value of both currencies, KalebPrime's math is outdated, and WoW gold is now worth even more than the bolivar.
That tweet has since gone viral, prompting @KalebPrime to joke that "At this rate when I publish my novel the quotes will read 'FROM THE GUY THAT MADE THE WOW GOLD > VENEZUELAN BOLIVAR TWEET.'"
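The comparison behind the tweet is simple division; as a sketch using the figures quoted in the story (both rates were approximate and, by the article's own account, already outdated):

```python
# Rates quoted at the time of the July 2017 tweet
bolivars_per_usd = 8493.97   # Venezuela's black-market exchange rate
gold_per_usd = 8385          # WoW gold obtainable per dollar via $20 tokens

# A dollar buys fewer gold pieces than bolivars, so one gold piece
# was already worth (slightly) more than one bolivar
usd_per_gold = 1 / gold_per_usd
usd_per_bolivar = 1 / bolivars_per_usd
print(usd_per_gold > usd_per_bolivar)  # True
```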

Math Journal Editors Resign To Start Rival Journal That Will Be Free To Read (59)

An anonymous reader writes: The four editors in chief of the Journal of Algebraic Combinatorics have informed their publisher, Springer, of their intention to launch a rival open-access journal, free for all to read, in protest of the publisher's high prices and limited accessibility. This is the latest in a string of what one observer called "editorial mutinies" over journal publishing policies. In a news release, the editors said their decision was not made because of any "particular crisis" but was the result of it becoming "more and more clear" that Springer intended to keep charging readers and authors large fees while "adding little value."

Apple's Adoption Of HEVC Will Drive A Massive Increase In Encoding Costs Requiring Cloud Hardware Acceleration (203)

An anonymous reader shares a report: For the last 10 years, H.264/AVC has been the dominant video codec used for streaming, but with Apple adopting H.265/HEVC in iOS 11 and Google heavily supporting VP9 in Android, a change is on the horizon. Next year the Alliance for Open Media will release their AV1 codec, which will improve video compression efficiency even further. The end result is that the codec market is about to get very fragmented, with content owners soon having to decide if they need to support three codecs (H.264, H.265, and VP9) instead of just H.264, and with AV1 expected on top of that in 2019. As a result of what's taken place in the codec market, and with better quality video being demanded by consumers, content owners, broadcasters and OTT providers are starting to see a massive increase in encoding costs. New codecs like H.265 and VP9 need 5x the server cost because of their complexity. Currently, AV1 needs over 20x the server cost. The mix of SD, HD and UHD continues to move to better quality: e.g. HDR, 10-bit and higher frame rates. Server encoding cost to move from 1080p SDR to 4K HDR is 5x. 360-degree video and Facebook's 6DoF video are also growing in consumption, which again increases encoding costs by at least 4x. If you add up all these variables, it's not hard to do the math and see that for some, encoding costs could increase by 500x over the next few years as new codecs, higher quality video, 360 video and general demand increases.
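The article's "500x" comes from multiplying its own rough estimates together; a sketch of that arithmetic (all multipliers are the article's approximations, not measured benchmarks):

```python
# Rough cost multipliers quoted in the article
av1_vs_h264 = 20    # AV1 encoding cost vs. H.264 today
to_4k_hdr = 5       # moving from 1080p SDR to 4K HDR
immersive = 4       # 360-degree / 6DoF video

worst_case = av1_vs_h264 * to_4k_hdr * immersive
print(worst_case)   # 400 -- growth in overall demand closes the gap to ~500x
```

The compounding is the point: no single factor is catastrophic, but they multiply rather than add.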

HackerRank Tries To Calculate Which US States Have The Best Developers (66)

An anonymous reader writes: Palo Alto-based HackerRank, which offers online programming challenges, "dug into our data of about 450,000 unique U.S. developers to uncover which states are home to the best software engineers, and which pockets of the country have the highest rate of developer growth." Examining the 24 months from 2015 through the end of 2016, they calculated the average score for each state in eight programming-related domains (algorithms, data structures, functional programming, math, Java, Ruby, C++, and Python). But it seems like low-population states would have fewer people taking the tests, meaning a disproportionate number of motivated and knowledgeable test takers could drastically skew the results. Sure enough, Wyoming -- with a population of just 584,153 -- has the smallest population of any U.S. state, but the site's second-highest average score, and the top score in three subject domains -- Ruby, data structures, and algorithms. And the District of Columbia -- population 681,170 -- has the highest average score for functional programming.
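The skew the reader describes is just small-sample variance: averages computed from a handful of test-takers wander much further from the true mean than averages over thousands. A quick simulation with synthetic scores (nothing to do with HackerRank's real data) makes the effect visible:

```python
import random

def state_average(n, rng):
    """Average score of n developers drawn from one shared distribution."""
    return sum(rng.gauss(50, 10) for _ in range(n)) / n

rng = random.Random(0)  # seeded for reproducibility
small_states = [state_average(20, rng) for _ in range(300)]
big_states = [state_average(2000, rng) for _ in range(300)]

spread = lambda xs: max(xs) - min(xs)
# Even with identical skill everywhere, tiny samples produce extreme averages
print(spread(small_states), spread(big_states))
```

Every simulated "state" here has identical underlying skill, yet the small-sample states still produce the most extreme rankings -- which is exactly why Wyoming topping three domains warrants skepticism.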

California, New York and Virginia still had the highest number of developers using the site, while Alaska, Wyoming and South Dakota not surprisingly had the least number of developers. But maybe the real take-away is that programmers are now becoming more distributed. HackerRank's announcement notes that the site "found growing developer communities and skilled developers all across the country. Previously, the highest concentrations of developers did not stray far from the tech hubs in California. Hawaii, Colorado, Virginia, and Nevada demonstrated the fastest growth in terms of developer activity on the HackerRank platform..." In addition, "we've had a noticeable uptick in customers across industries, from healthcare to retail and finance, with strong demand for identifying technical skills quickly."

Their conclusion? "Today, as the demand for developers goes beyond technology and as there is more opportunity to work remotely, there's a more distributed workforce of skilled developers across the nation, from the Rust Belt to the East Coast... Software developers aren't just attached to VCs, startups or Silicon Valley anymore."

A New Sampling Algorithm Could Eliminate Sensor Saturation (135)

Baron_Yam shared an article from Science Daily: Researchers from MIT and the Technical University of Munich have developed a new technique that could lead to cameras that can handle light of any intensity, and audio that doesn't skip or pop. Virtually any modern information-capture device -- such as a camera, audio recorder, or telephone -- has an analog-to-digital converter in it, a circuit that converts the fluctuating voltages of analog signals into strings of ones and zeroes. Almost all commercial analog-to-digital converters (ADCs), however, have voltage limits. If an incoming signal exceeds that limit, the ADC either cuts it off or flatlines at the maximum voltage. This phenomenon is familiar as the pops and skips of a "clipped" audio signal or as "saturation" in digital images -- when, for instance, a sky that looks blue to the naked eye shows up on-camera as a sheet of white.

Last week, at the International Conference on Sampling Theory and Applications, researchers from MIT and the Technical University of Munich presented a technique that they call unlimited sampling, which can accurately digitize signals whose voltage peaks are far beyond an ADC's voltage limit. The consequence could be cameras that capture all the gradations of color visible to the human eye, audio that doesn't skip, and medical and environmental sensors that can handle both long periods of low activity and the sudden signal spikes that are often the events of interest.

One of the paper's authors explains that "The idea is very simple. If you have a number that is too big to store in your computer memory, you can take the modulo of the number."
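The recovery mathematics in the unlimited-sampling paper is the hard part, but the folding idea in that quote is easy to picture: instead of saturating at its limit, a self-reset converter wraps out-of-range voltages back into range, modulo-style. A toy illustration with idealized numbers (not the researchers' hardware):

```python
lam = 1.0  # the ADC's voltage limit (illustrative value)

def clip(v):
    """Conventional ADC: anything beyond the limit flatlines."""
    return max(-lam, min(lam, v))

def fold(v):
    """Modulo ADC: out-of-range voltages wrap back into [-lam, lam)."""
    return (v + lam) % (2 * lam) - lam

signal = [0.2, 0.9, 1.7, 2.5, -3.1]
print([clip(v) for v in signal])                 # peaks saturate at +/- 1.0
print([round(fold(v), 3) for v in signal])       # folded, information survives
```

Clipping destroys the peaks permanently; folding merely scrambles them, and the paper's contribution is an algorithm that unscrambles the folded samples back into the original signal.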

Crypto-Bashing Prime Minister Argues The Laws Of Mathematics Don't Apply In Australia (330)

An anonymous reader quotes the Independent: Australian Prime Minister Malcolm Turnbull has said the laws of mathematics come second to the law of the land in a row over privacy and encryption... When challenged by a technology journalist over whether it was possible to tackle the problem of criminals using encryption -- given that platform providers claim they are currently unable to break into the messages even if required to do so by law -- the Prime Minister raised eyebrows as he made his reply. "Well the laws of Australia prevail in Australia, I can assure you of that. The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia," he said... "The important thing is to recognise the challenge and call on the companies for assistance. I am sure they know morally they should... They have to face up to their responsibility."
Facebook has already issued a statement saying that they "appreciate the important work law enforcement does, and we understand the need to carry out investigations. That's why we already have a protocol in place to respond to any requests we can.

"At the same time, weakening encrypted systems for them would mean weakening it for everyone."

Mathematical Biology Is Our Secret Weapon In the Fight Against Disease (57)

An anonymous reader shares excerpts from a Scientific American article: In recent years, increasingly detailed experimental procedures have led to a huge influx of biological data available to scientists. This data is being used to generate hypotheses about the complexity of previously abstruse biological systems. In order to test these hypotheses, they must be written down in the form of a model which can be interrogated to determine whether it correctly mimics the biological observations. Mathematics is the natural language in which to do this. In addition, the advent of, and subsequent increase in, computational ability over the last 60 years has enabled us to suggest and then interrogate complex mathematical models of biological systems. The realisation that biological systems can be treated mathematically, coupled with the computational ability to build and investigate detailed biological models, has led to the dramatic increase in the popularity of mathematical biology. Maths has become a vital weapon in the scientific armoury we have to tackle some of the most pressing questions in medical, biological and ecological science in the 21st century. By describing biological systems mathematically and then using the resulting models, we can gain insights that are impossible to access through experiments and verbal reasoning alone. Mathematical biology is incredibly important if we want to change biology from a descriptive into a predictive science -- giving us power, for example, to avert pandemics or to alter the effects of debilitating diseases.
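A concrete example of the kind of model the article describes is the classic SIR epidemic model: three coupled equations tracking the susceptible, infected, and recovered fractions of a population. A minimal forward-Euler sketch, with textbook parameter values not taken from the article:

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1, dt=1.0):
    """One Euler step: infections flow S -> I at rate beta*s*i,
    recoveries flow I -> R at rate gamma*i."""
    infections = beta * s * i
    recoveries = gamma * i
    return (s - dt * infections,
            i + dt * (infections - recoveries),
            r + dt * recoveries)

s, i, r = 0.99, 0.01, 0.0    # start with 1% of the population infected
for _ in range(160):         # simulate 160 days
    s, i, r = sir_step(s, i, r)
print(round(s, 2), round(i, 2), round(r, 2))
```

With these parameters the basic reproduction number is beta/gamma = 3, so the model predicts the epidemic sweeps through most of the population -- exactly the kind of prediction that lets modellers compare interventions (vaccination lowers s, distancing lowers beta) before trying them in the real world.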

The Quirky Habits of Certified Science Geniuses (190)

dryriver shares a report from the BBC: Celebrated inventor and physicist Nikola Tesla swore by toe exercises -- every night, he'd repeatedly "squish" his toes, 100 times for each foot, according to the author Marc J Seifer. While it's not entirely clear exactly what that exercise involved, Tesla claimed it helped to stimulate his brain cells. The most prolific mathematician of the 20th Century, Paul Erdos, preferred a different kind of stimulant: amphetamine, which he used to fuel 20-hour number benders. When a friend bet him $500 that he couldn't stop for a month, he won but complained "You've set mathematics back a month." Newton, meanwhile, bragged about the benefits of celibacy. When he died in 1727, he had transformed our understanding of the natural world forever and left behind 10 million words of notes; he was also, by all accounts, still a virgin (Tesla was also celibate, though he later claimed he fell in love with a pigeon). It's common knowledge that sleep is good for your brain -- and Einstein took this advice more seriously than most. He reportedly slept for at least 10 hours per day -- nearly one and a half times as much as the average American today (6.8 hours). But can you really slumber your way to a sharper mind? Many of the world's most brilliant scientific minds were also fantastically weird. From Pythagoras' outright ban on beans to Benjamin Franklin's naked "air baths," the path to greatness is paved with some truly peculiar habits.

Jean Sammet, Co-Designer of COBOL, Dies at 89 (73)

theodp writes: A NY Times obituary reports that early software engineer and co-designer of COBOL Jean Sammet died on May 20 in Maryland at age 89. "Sammet was a graduate student in math when she first encountered a computer in 1949 at the Univ. of Illinois at Urbana-Champaign," the Times reports. While Grace Hopper is often called the "mother of COBOL," Hopper "was not one of the six people, including Sammet, who designed the language -- a fact Sammet rarely failed to point out... 'I yield to no one in my admiration for Grace,' she said. 'But she was not the mother, creator or developer of COBOL.'"
By 1960 the Pentagon had announced it wouldn't buy computers unless they ran COBOL, inadvertently creating an industry standard. COBOL "really was very good at handling formatted data," Brian Kernighan tells the Times, which reports that today "More than 200 billion lines of COBOL code are now in use and an estimated 2 billion lines are added or changed each year, according to IBM Research."

Sammet was entirely self-taught, and in an interview two months ago shared a story about how her supervisor in 1955 had asked if she wanted to become a computer programmer. "What's a programmer?" she asked. He replied, "I don't know, but I know we need one." Within five years she'd become the section head of MOBIDIC Programming at Sylvania Electric Products, and had helped design COBOL -- before moving on to IBM, where she worked for the next 27 years and created the FORTRAN-based computer algebra system FORMAC.

When AI Botches Your Medical Diagnosis, Who's To Blame? (200)

Robert Hart has posed an interesting question in his report on Quartz: When artificial intelligence botches your medical diagnosis, who's to blame? Do you blame the AI, the designer, or the organization? It's just one of many questions popping up and starting to be seriously pondered by experts as artificial intelligence and automation become more entwined in our daily lives. From the report: The prospect of being diagnosed by an AI might feel foreign and impersonal at first, but what if you were told that a robot physician was more likely to give you a correct diagnosis? Medical error is currently the third leading cause of death in the U.S., and as many as one in six patients in the British NHS receive incorrect diagnoses. With statistics like these, it's unsurprising that researchers at Johns Hopkins University believe diagnostic errors to be "the next frontier for patient safety." Of course, there are downsides. AI raises profound questions regarding medical responsibility. Usually when something goes wrong, it is a fairly straightforward matter to determine blame. A misdiagnosis, for instance, would likely be the responsibility of the presiding physician. A faulty machine or medical device that harms a patient would likely see the manufacturer or operator held to account. What would this mean for an AI?
