Happy Ada Lovelace Day 124

Today is Ada Lovelace Day, a time to celebrate the achievements of women in STEM fields. Several publications have put together lists of notable women to commemorate the day, such as tech pioneers, robotics experts, and historical engineers and scientists. Others are taking the opportunity to keep pushing against the elements of tech culture that remain sexist. From the BBC: On Ada Lovelace Day, four female engineers from around the world share their experiences of working in male-dominated professions. When Isis Anchalee's employer OneLogin asked her to take part in its recruitment campaign, she didn't rush to consult the selfie-loving Kardashian sisters for styling tips. "I was wearing very minimal make-up. I didn't brush my hair that day," she said. But the resulting image of Ms Anchalee created a social media storm when it appeared on BART, the San Francisco metro. Lots of people questioned whether she really was an engineer. "It was not just limited to women — it resonates with every single person who doesn't fit with what the stereotype should look like," she said.

"My parents, my brother, my community, all were against me," said Sovita Dahal of her decision to pursue a career in technology. "I was going against traditional things. In my schooldays I was fascinated by electronic equipment like motors, transformers and LED lights. Later on this enthusiasm became my passion and ultimately my career," she said.


Review: The Martian 241

I was both pleased and disappointed, as always, when I heard that a book I enjoyed was being made into a movie. Andy Weir's The Martian was the best new book I'd read in years. It was written for nerds, by a nerd — by somebody with an obvious love for NASA, science, and spaceflight. How could it possibly be condensed into the format of a Hollywood blockbuster? Well, director Ridley Scott and screenwriter Drew Goddard figured out how. The Martian is an excellent film, well worth watching. Read on for my review (very minor spoilers only), and feel free to share your own in the comments.

Barbie Gets a Brain 235

minstrelmike writes: Mattel is coming out with a Talking Barbie designed by a huge team, pre-scripted with thousands of responses controlled by an AI, and intended to be your best friend. The design team remembers the "Math is hard" debacle of the 1990s, so if a girl asks Barbie whether she's pretty, Barbie will respond, "Yes. And you're smart, too." If she asks whether Barbie believes in God, Barbie says a person's beliefs are personal, and suggests talking to grownups about some problems. The linked New York Times article ("Barbie Wants to Get to Know Your Child") even discusses trying to avoid edited videos on YouTube by scripting out words such as "cockroach."
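
The scripted-response approach the article describes can be sketched as a simple lookup plus a blocklist. Everything here — the phrases, the answers, and the matching logic — is invented for illustration and is not Mattel's actual system.

```python
# Hypothetical sketch of keyword-matched scripted responses, in the spirit
# of the article's description; none of this reflects Mattel's real code.

SCRIPTED_RESPONSES = {
    "am i pretty": "Yes. And you're smart, too.",
    "do you believe in god": "I think a person's beliefs are very personal to them.",
}

# Words the designers scripted out entirely, per the article.
BLOCKED_WORDS = {"cockroach"}

def respond(question: str) -> str:
    q = question.lower().strip(" ?!.")
    for phrase, answer in SCRIPTED_RESPONSES.items():
        if phrase in q:
            return answer
    if any(w in q.split() for w in BLOCKED_WORDS):
        return "Hmm, let's talk about something else!"
    # Fallback: deflect to a trusted adult, as the article says Barbie does.
    return "That sounds like a good question for a grown-up you trust."
```

The real system reportedly has thousands of scripted lines; the point of the sketch is only that every answer is pre-written, not generated.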

The Handheld Analog Computer That Made the Atomic Bomb 45

szczys writes: When the physicists and mathematicians of the Manhattan Project began their work they needed to establish which substance was most likely to sustain vigorous fission. This is not trivial math, and the solution of course is to use an advanced computer. If only they had one available. The best computer of the time was a targeting calculation machine that was out of service while being moved from one installation to another. The unlikely fill-in was a simple yet ingenious analog computer called the FERMIAC. When rolled along a piece of paper it calculated neutron collisions with simple markings — doing its small part to forever change the world without a battery, transistor, or tube.
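
The FERMIAC mechanized Monte Carlo sampling of neutron histories: roll a random free path, check for a collision, pick a new direction, repeat. A toy one-dimensional version of that idea — with invented parameters rather than real nuclear data, and not the FERMIAC's actual procedure — might look like:

```python
# Toy Monte Carlo neutron transport in 1D, illustrating the kind of
# sampling the FERMIAC mechanized on paper. All parameters are invented
# for illustration, not physical data from the Manhattan Project.
import random

def simulate(n_neutrons, slab_width, mean_free_path, absorb_prob, rng):
    """Return the fraction of neutrons that escape a slab (either face)."""
    escaped = 0
    for _ in range(n_neutrons):
        x = 0.0
        direction = 1.0  # start moving into the slab
        while True:
            # Sample the distance to the next collision (exponential law).
            x += direction * rng.expovariate(1.0 / mean_free_path)
            if x < 0.0 or x > slab_width:
                escaped += 1  # left the material
                break
            if rng.random() < absorb_prob:
                break  # absorbed at the collision site
            # Scatter: pick a new direction at random.
            direction = rng.choice([-1.0, 1.0])
    return escaped / n_neutrons

rng = random.Random(42)
print(simulate(10_000, slab_width=5.0, mean_free_path=1.0,
               absorb_prob=0.3, rng=rng))
```

The FERMIAC did the equivalent sampling mechanically, drawing each neutron's path as it rolled across a scale drawing of the assembly.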

MIT Simplifies Design Process For 3D Printing 45

An anonymous reader writes: New software out of MIT and the Interdisciplinary Center Herzliya in Israel takes CAD files and automatically builds visual models that users can alter with simple, visual sliders. It works by computing myriad design variations before a user asks for them. When the CAD file is loaded, the software runs through a host of size variations on various properties of the object, evaluating whether the changes would work in a 3D printer, and doing the necessary math to plan tool routes. When a user moves one of the sliders, it switches the design along these pre-computed values. "The system automatically weeds out all the parameter values that lead to unprintable or unstable designs, so the sliders are restricted to valid designs. Moving one of the sliders — changing the height of the shoe's heel, say, or the width of the mug's base — sweeps through visual depictions of the associated geometries."

There are two big drawbacks: first, it requires a lot of up-front processing power to compute the variations on an object. Second, resolution for changes is fixed if you want quick results — changing the design for a pair of 3D-printed shoes from size 8 to size 9 might be instantaneous, but asking for a shoe that's a quarter of a millimeter longer than a size 8 would take several minutes to process. But for scrolling through the pre-computed design changes, the software can present "in real time what would take hours to calculate with a CAD program," and without the requisite experience with CAD.
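
The precompute-then-restrict idea can be sketched in a few lines. The parameter name and the validity check below are invented stand-ins for the real (expensive) printability and stability analysis:

```python
# Minimal sketch of the precompute-then-restrict approach described above.
# `is_printable` and its threshold values are invented for illustration;
# the real system evaluates full CAD geometry for printability/stability.

def is_printable(heel_height_mm: float) -> bool:
    # Stand-in for the expensive printability/stability analysis.
    return 5.0 <= heel_height_mm <= 80.0

def precompute_valid_values(candidates):
    """Run the expensive check once, up front, for every candidate value."""
    return [v for v in candidates if is_printable(v)]

# Sweep heel heights in 2.5 mm steps; the slider is then restricted to this
# precomputed list, so moving it can never land on an invalid design.
candidates = [i * 2.5 for i in range(0, 41)]   # 0.0 .. 100.0 mm
valid = precompute_valid_values(candidates)
print(valid[0], valid[-1])  # 5.0 80.0
```

This also makes the resolution trade-off in the article concrete: values on the 2.5 mm grid are instant, while anything off the grid would need a fresh run of the expensive check.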

Ada Lovelace and Her Legacy 139

nightcats writes: Nature has an extensive piece on the legacy of the "enchantress of abstraction," the extraordinary Victorian-era computer pioneer Ada Lovelace, daughter of the poet Lord Byron. Her monograph on the Babbage machine was described by Babbage himself as a creation of "that Enchantress who has thrown her magical spell around the most abstract of Sciences and has grasped it with a force that few masculine intellects (in our own country at least) could have exerted over it." Ada's remarkable merging of intellect and intuition — her capacity to analyze and capture the conceptual and functional foundations of the Babbage machine — is summarized with a historical context which reveals the precocious modernity of her scientific mind. "By 1841 Lovelace was developing a concept of 'Poetical Science', in which scientific logic would be driven by imagination, 'the Discovering faculty, pre-eminently. It is that which penetrates into the unseen worlds around us, the worlds of science.' She saw mathematics metaphysically, as 'the language of the unseen relations between things;' but added that to apply it, 'we must be able to fully appreciate, to feel, to seize, the unseen, the unconscious.' She also saw that Babbage's mathematics needed more imaginative presentation."

You Don't Have To Be Good At Math To Learn To Code 616

Olga Khazan writes in The Atlantic that learning to program involves a lot of Googling, logic, and trial and error — but almost nothing beyond fourth-grade arithmetic. Victoria Fine explains how she taught herself to code despite hating math. Her secret? Lots and lots of Googling. "Like any good Google query, a successful answer depended on asking the right question. 'How do I make a website red' was not nearly as successful a question as 'CSS color values HEX red' combined with 'CSS background color.' I spent a lot of time learning to Google like a pro. I carefully learned the vocabulary of HTML so I knew what I was talking about when I asked the Internet for answers." According to Khazan, while it's true that some types of code look a little like equations, you don't really have to solve them, just know where they go and what they do. "In most cases you can see that the hard maths (the physics and geometry) is either done by a computer or has been done by someone else. While the calculations do happen and are essential to the successful running of the program, the programmer does not need to know how they are done." Khazan says that in order to figure out what your program should say, you'll need some basic logic skills, and you'll need to be skilled at copying, pasting, and slightly tweaking things from online repositories. "But humanities majors, fresh off writing reams of term papers, are probably more talented at that than math majors are."

Machine Learning Could Solve Economists' Math Problem 157

An anonymous reader writes: Noah Smith argues that the field of economics frequently uses math in an unhealthy way. He says many economists don't use math as a tool to describe reality, but rather as an abstract foundation for whatever theory they've come up with. A possible solution to this, he says, is machine learning: "In other words, econ is now a rogue branch of applied math. Developed without access to good data, it evolved different scientific values and conventions. But this is changing fast, as information technology and the computer revolution have furnished economists with mountains of data. As a result, empirical analysis is coming to dominate econ. ... [Two economists pushing this change] stated that machine learning techniques emphasized causality less than traditional economic statistical techniques, or what's usually known as econometrics. In other words, machine learning is more about forecasting than about understanding the effects of policy. That would make the techniques less interesting to many economists, who are usually more concerned about giving policy recommendations than in making forecasts."

John Conway: All Play and No Work For a Genius 55

An anonymous reader points out Quanta's spotlight piece on mathematician John Conway, whose best known mathematical contribution is probably his "Game of Life," which has inspired many a screensaver and more than a few computer science careers. From the article: Based at Princeton University, though he found fame at Cambridge (as a student and professor from 1957 to 1987), John Horton Conway, 77, claims never to have worked a day in his life. Instead, he purports to have frittered away reams and reams of time playing. Yet he is Princeton's John von Neumann Professor in Applied and Computational Mathematics (now emeritus). He's a fellow of the Royal Society. And he is roundly praised as a genius. "The word 'genius' gets misused an awful lot," said Persi Diaconis, a mathematician at Stanford University. "John Conway is a genius. And the thing about John is he'll think about anything. He has a real sense of whimsy. You can't put him in a mathematical box."
Open Source

Debian Founder: How I Came To Find Linux 136

An anonymous reader writes: Ian Murdock has pretty solid open source cred: in 1993 he founded Debian, he was the CTO of Progeny and the Linux Foundation, and he helped pave the way for OpenSolaris. He has published a post about how he initially joined the Linux ecosystem. Quoting: "[In 1992], I spent most evenings in the basement of the MATH building basking in the green phosphorescent glow of the Z-29 terminals, exploring every nook and cranny of the UNIX system upstairs. ... I was also accessing UNIX from home via my Intel 80286-based PC and a 2400-baud modem, which saved me the trek across campus to the computer lab on particularly cold days. Being able to get to the Sequent from home was great, but I wanted to replicate the experience of the ENAD building's X terminals, so one day, in January 1993, I set out to find an X server that would run on my PC. As I searched for such a thing on Usenet, I stumbled across something called 'Linux.'" How did you come to find Linux?

How Weather Modeling Gets Better 43

Dr_Ish writes: Bob Henson over at Weather Underground has posted a fascinating discussion of the recent improvements made to the major weather models that are used to forecast hurricanes and the like. The post also includes interesting links that explain more about the models. Quoting: "The latest version of the ECMWF model, introduced in May, has significant changes to model physics and the ways in which observations are brought into and used within the model. The overall improvements include better portrayal of clouds and precipitation, including a more accurate depiction of intense rainfall. The main effect of the model upgrade for tropical cyclones is slightly lower central pressure. During the first 3 days of a forecast, the ECMWF has tended to have a slight weak bias on tropical cyclones; the new version is closer to the mark."

XKCD Author's New Unpublished Book Becomes Scientific Best-Seller 90

An anonymous reader writes: XKCD cartoonist Randall Munroe will be publishing a new book in November, but it's already become Amazon's #1 best-seller in two "Science & Math" subcategories, for mechanics and scientific instruments. Inspired by a cartoon describing NASA's Saturn V rocket as "the up-goer V", Randall's created a large-format collection of blueprints describing datacenters, tectonic plates, and even the controls in an airplane cockpit — using only the thousand most common English words. "Since this book explains things, I've called it Thing Explainer," Randall writes on the XKCD blog, trying to mimic the humorously simple style of his book. Randall's previous book of scientific hypotheticals — published one year ago — is still Amazon's #1 best-selling book in their "Physics" category, ranking higher than Stephen Hawking's "A Brief History of Time."

UK Industry Group Boss: Study Arts So Games Are Not Designed By 'Spotty Nerds' 207

nickweller writes: John Cridland is the leader of the Confederation of British Industry, a group that represents over 100,000 UK businesses. In a recent interview, he spoke about his enthusiasm for adding arts education to more traditional STEM (science, technology, engineering, and math) programs. Here's how he chose to express that: "One of the biggest growth industries in Britain today is the computer games industry. We need extra coders — dozens and dozens of them but nobody is going to play a game designed by a spotty nerd. We need people with artistic flair." Cridland also expressed support for an increased emphasis on foreign language education: "If we’re not capable of speaking other people’s languages, we’re going to be in difficulties. However, there is far too much emphasis placed on teaching French and German. The language we most need going forward is Spanish (the second most frequently spoken language in the world). That and a certain percentage need to learn Mandarin to develop relations with China."

The Connoisseur of Number Sequences 63

An anonymous reader writes: 75-year-old Neil Sloane is considered by many to be one of the most influential mathematicians of our time, not because of the theorems he's proved, but because of his creation: The Online Encyclopedia of Integer Sequences (OEIS). Quanta Magazine reports: "This giant repository, which celebrated its 50th anniversary last year, contains more than a quarter of a million different sequences of numbers that arise in different mathematical contexts, such as the prime numbers (2, 3, 5, 7, 11, ...) or the Fibonacci sequence (0, 1, 1, 2, 3, 5, 8, 13, ...). What's the greatest number of cake slices that can be made with n cuts? Look up sequence A000125 in the OEIS. How many chess positions can be created in n moves? That's sequence A048987. The number of ways to arrange n circles in a plane, with only two crossing at any given point, is A250001. That sequence just joined the collection a few months ago. So far, only its first four terms are known; if you can figure out the fifth, Sloane will want to hear from you."
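
For the cake-slice example, A000125 (the "cake numbers") has a well-known closed form, (n³ + 5n + 6)/6, equivalently C(n,3) + C(n,2) + C(n,1) + C(n,0), so the sequence is easy to generate:

```python
# A000125, the cake numbers: the maximum number of pieces a (3-D) cake
# can be cut into with n planar cuts.
# Closed form: (n^3 + 5n + 6) / 6 = C(n,3) + C(n,2) + C(n,1) + C(n,0).

def cake_number(n: int) -> int:
    return (n**3 + 5 * n + 6) // 6

print([cake_number(n) for n in range(8)])  # [1, 2, 4, 8, 15, 26, 42, 64]
```

A250001, by contrast, has no known formula at all — which is exactly why its fifth term is still open.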

Using Math To Tune a Video Game's Economy 96

An anonymous reader writes: When the shipping deadline was approaching for The Witcher 3, designer Matthew Steinke knew there was a big part of the game still missing: its economy. A game's economy is one of the things that can make or break immersion — you want collection and rewards to feel progressive and meaningful. Making items too expensive gives the game a grindy feel, while making them too cheap makes progression trivial. At the Game Developers Conference underway in Germany, Steinke explained his solution.

"Steinke created a formula that calculated attributes like how much damage, defense, or healing each item provided, and he placed them into an overall combat rating that could be used to rank other items in the system. ... Steinke set about blending the sub-categories into nine generalized categories, allowing him to determine the final weighting for damage and the range of prices for each item. To test if it all worked, he used polynomial least squares (a form of mathematical statistics) to chart each category's price progression. The resultant curve showed the rate at which spending was increasing as the quality of each item approached the category's ceiling value."
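
Polynomial least squares of the kind described can be done by solving the normal equations directly. This sketch fits a quadratic to exact made-up rating/price data — the talk's actual data points and polynomial degree aren't given here:

```python
# Polynomial least-squares sketch: fit a quadratic price-progression curve
# to (combat rating, price) pairs. The data below is invented; it follows
# y = 5 + 2x + 3x^2 exactly so the fit can be checked against known values.

def polyfit_quadratic(xs, ys):
    """Fit y ~ a0 + a1*x + a2*x^2 by solving the 3x3 normal equations."""
    s = [sum(x**k for x in xs) for k in range(5)]            # sums of x^0..x^4
    t = [sum(y * x**k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = t[:]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    # Back substitution on the upper-triangular system.
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coeffs[i] = (b[i] - sum(A[i][c] * coeffs[c]
                                for c in range(i + 1, 3))) / A[i][i]
    return coeffs  # [a0, a1, a2]

ratings = [1, 2, 3, 4, 5]
prices = [5 + 2 * x + 3 * x**2 for x in ratings]  # invented, exactly quadratic
a0, a1, a2 = polyfit_quadratic(ratings, prices)
```

The fitted curve's second-order coefficient is what captures the "spending accelerates toward the ceiling" shape the quote describes.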

Behind the Microsoft Write-Off of Nokia 200

UnknowingFool writes: Microsoft previously announced that it had written off the Nokia purchase, taking a $7.6B charge in the last quarter. In doing so, Microsoft recorded only the third unprofitable quarter in the company's history. Released on July 31, new financial documents detail some of the reasoning and financials behind this decision. At the core of the problem: the Phone Hardware business was worth only $116M after adjusting for costs and market factors. One of those factors was poor sales of Nokia handsets in 2015. Financially, it made more sense to write it all off.

Microsoft Creates a Quantum Computer-Proof Version of TLS Encryption Protocol 128

holy_calamity writes: When (or if) quantum computers become practical, they will make existing forms of encryption useless. But now researchers at Microsoft say they have made a quantum-proof version of the TLS encryption protocol that we could use to keep online data secure in the quantum computing era. It is based on a mathematical problem that is very difficult for both conventional and quantum computers to crack. That tougher math meant data moved about 20 percent slower in comparisons with conventional TLS, but Microsoft says the design could be practical if properly tuned for use in the real world.

For the Love of the Analytics of the Game: Before Beane, There Was AVM Systems 16

theodp writes: Those of you slogging your way through EdX's (free) Sabermetrics 101: Introduction to Baseball Analytics MOOC might want to take a break from your R and SQL coding to check out Grantland's Before Beane, in which Ben Lindbergh tells the story of AVM Systems, the little-known company that jump-started sabermetrics and made Moneyball possible. Ken Mauriello, whose love-for-the-analytics-of-the-game led him to ditch a trading career to co-found AVM in the mid-90s, said of the early days, "Back in the day we weren't doing presentations [to skeptical MLB teams] with laptops. We were carrying around two enormous boxes with an enormous monitor and an enormous tower. It was like Planes, Trains & Automobiles traveling around with that stuff. Watching a great big Gateway box with your monitor come tumbling out upside down, and you pick it up and it's rattling. ... So we're in the hotel, saying, 'Please lord, let this thing work.'"

AMD Forces a LibreOffice Speed Boost With GPU Acceleration 144

New submitter samtuke writes: AMD processors get rated and reviewed based on performance. It is in our self-interest to make things work really, really fast on AMD hardware. AMD engineers contribute to LibreOffice, for good reason. Think about what happens behind a spreadsheet calculation: there can be a huge amount of math. Writing software to take advantage of a Graphics Processing Unit (GPU) for general-purpose computing is non-trivial. We know how to do it. AMD engineers wrote OpenCL kernels and contributed them to the open source code base. Turning on the OpenCL option to enable GPU Compute resulted in a 500X+ speedup: about ¼ second vs. 2 minutes, 21 seconds. Those measurements come specifically from the ground-water use sample in this set of LibreOffice spreadsheets.