Welcome to Red Slime.com

Featured

Welcome to Red Slime, the personal journal of Orange Claymore. This site contains work I have done over the years, both audio and video. Things I like to do are as follows: skateboarding, computers, video games, listening to bad music, guns, and creating music (check the Noise section). I also pass along live feeds of interesting and related topics. Explore and have fun.

UPDATE: The new 404_Error band archive site and my old project, bland officer, are now active on the Noise page!

Dyson’s Humidifier Uses UV Light To Kill Germs In its Water Reservoir

The slow but steady approach of winter means that it’s almost time for many of us to fire up our heaters—also heralding the return of chapped lips and dry skin. Dyson’s new humidifier is one solution to the problem, but it doesn’t only prevent dry air. It also ensures your home isn’t being filled with bacteria-ridden moisture, thanks to a germ-killing UV light.

Based on the design of Dyson’s well-regarded bladeless fans and heaters, the company’s new humidifier introduces a three-liter water reservoir that uses a piezoelectric transducer (vibrating up to 1.7 million times a second) to produce microscopic airborne water droplets. But the water in that reservoir is just as susceptible to infiltration by germs and bacteria as your dry winter sinuses are.

To ensure the new humidifier isn’t just spreading sickness around your home, Dyson exposes the water in the reservoir to ultraviolet light twice, killing 99.9 percent of bacteria before the drops are sent drifting through your home. So the odds of you getting sick and missing work are greatly reduced (whether that’s a pro or a con is up to you). And because the humidifier uses Dyson’s Air Multiplier technology, it should do a much better job of boosting the humidity throughout your entire home, not just the general vicinity of the device itself.
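
As an aside, that “99.9 percent” figure is what disinfection specs call a 3-log reduction: the surviving fraction is one in a thousand. A quick illustration (the starting bacteria count below is an arbitrary example, not a Dyson figure):

```python
import math

# "99.9 percent of bacteria" killed corresponds to a 3-log reduction:
# the surviving fraction is 10^-3.
surviving_fraction = 1 - 0.999
log_reduction = -math.log10(surviving_fraction)
print(f"log reduction: {log_reduction:.1f}")  # 3.0

start_count = 1_000_000  # hypothetical bacteria in the reservoir
print(f"survivors out of {start_count:,}: {start_count * surviving_fraction:.0f}")
```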

Don’t expect relief this winter, though, unless you live in Japan. As with its new 360 Eye robotic vacuum, Dyson is holding off on a US release until next year, and its new humidifier isn’t expected to hit stores here until the fall of 2015. [Dyson]

Source Article from http://feeds.gawker.com/~r/gizmodo/full/~3/UW4c9ROfNZM/dysons-humidifier-uses-uv-light-to-kill-germs-in-its-wa-1646133657

A Microsoft smartwatch could appear in weeks

Do you remember SPOT? Well, that was Microsoft’s very early smartwatch effort, lost in the mists of bygone technology. A more modern effort from the company has been a long time coming, but it’s finally, apparently, on its way, and soon. According to Forbes’ anonymous sources, Microsoft’s next smartwatch will be able to passively track your heart rate (meaning less stress on the battery) and work across several mobile platforms. Both points make a lot of sense, but the latter could be especially important if Microsoft wants a hit: Windows Phone is still a very distant third to both Android devices and the iPhone. The rumored product could also explain why the heck Microsoft developed a smartwatch keyboard in the first place. We’ll let you know more when we hear it.

Source Article from http://www.engadget.com/2014/10/20/microsoft-smartwatch-coming/?ncid=rss_truncated

High Altitude Balloon Keeps Going

Balloon

Here’s a post from the AMSAT-UK high altitude balloon blog. It’s a great story about a balloon cruising at about 12km above the Earth completing its sixth circumnavigation of the planet. That post is from October 4th, and two weeks later the balloon is still going strong. Right now it’s over the Baltic heading into Russia with no sign of stopping or popping any time soon.

The balloon was launched July 12, 2014 from Silverstone, UK. In the 100 days since then, it has covered 144,168 kilometers and has crossed its launch longitude six times. Even though the balloon has been trapped at high latitudes (including coming within 9 km of the pole), it has still travelled more than three times the equatorial circumference of the Earth.
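
The numbers above check out with some quick arithmetic, using the standard reference value of roughly 40,075 km for Earth’s equatorial circumference:

```python
# Sanity check of the flight figures: 144,168 km over 100 days,
# against Earth's equatorial circumference (~40,075 km).
EQUATOR_KM = 40_075
distance_km = 144_168
days = 100

laps = distance_km / EQUATOR_KM
print(f"{laps:.2f} equatorial circumferences")  # just over 3.5

avg_kmh = distance_km / (days * 24)
print(f"average ground speed: {avg_kmh:.0f} km/h")
```

At high latitudes a circumnavigation is a much shorter loop than the equator, which is how the balloon managed six of them in that distance.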

The balloon was built by [Leo Bodnar] a.k.a. [M0XER] with a self-made plastic foil envelope. The solar-powered payload weighs only 11 grams. It’s an exceptional accomplishment and one that has smashed all the amateur high altitude balloon distance records we can find.

Source Article from http://feedproxy.google.com/~r/hackaday/LgoM/~3/ft3I1Tlz75Y/

This Short Film Explains Our Fascination With Talking to Computers

Using names like Siri, Cortana, and Google Now, advanced algorithms and technologies that would have baffled engineers and scientists half a century ago now rest in the palm of our hands. Talking with technology is the future of computing—mainly because that’s the way we’re built to communicate.

Google’s short eight-minute documentary, Behind the Mic: The Science of Talking With Computers, explores humanity’s obsession with conversing with machines and the challenges of developing language learning algorithms. At its most basic, this drive for verbal interaction with our tech comes from how human speech develops, as Google computer scientist Geoffrey Hinton explains in the video:

We come into this world with the innate abilities to learn how to interact with other sentient beings. Suppose you had to interact with other people by writing little messages to them. It would be a real pain. That’s how we interact with computers. It’s much easier to talk to them. It’s just so much easier if the computer can understand what we’re saying.

Despite using keyboards for decades—making us more comfortable with the convenience of text than actually talking—a recent Google study shows that teenage smartphone users are more likely to use voice search than their parents’ generation.

But how we got to this point is actually a 62-year-long epic, starting with Bell Laboratories in 1952, which developed a machine that could only recognize numbers spoken by one specific person. Carnegie Mellon’s Harpy speech recognition system and other mathematical approaches, like the Hidden Markov model, began the slow trek toward what we recognize today as speech recognition.
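
The Hidden Markov model mentioned above treats speech as a sequence of hidden states (phonemes, say) that emit observable acoustic features; recognition means finding the most likely hidden sequence, classically via the Viterbi algorithm. A toy sketch, where every state, symbol, and probability is invented purely for illustration:

```python
# Toy HMM: two hidden "phoneme" states emitting two acoustic symbols.
# All probabilities below are made up for illustration.
start = [0.6, 0.4]                 # P(initial state)
trans = [[0.7, 0.3], [0.4, 0.6]]   # P(next state | current state)
emit = [[0.9, 0.1], [0.2, 0.8]]    # P(observed symbol | state)

def viterbi(obs, start, trans, emit):
    """Return the most likely hidden-state sequence for the observations."""
    n = len(start)
    v = [start[s] * emit[s][obs[0]] for s in range(n)]  # path probabilities
    back = []
    for o in obs[1:]:
        # scores[j][i]: probability of the best path ending in state j
        # whose previous state was i.
        scores = [[v[i] * trans[i][j] * emit[j][o] for i in range(n)]
                  for j in range(n)]
        back.append([max(range(n), key=lambda i: scores[j][i])
                     for j in range(n)])
        v = [max(scores[j]) for j in range(n)]
    path = [max(range(n), key=lambda s: v[s])]
    for ptrs in reversed(back):           # trace back the best predecessors
        path.append(ptrs[path[-1]])
    return list(reversed(path))

print(viterbi([0, 1, 1], start, trans, emit))  # → [0, 1, 1]
```

Real recognizers of the era chained thousands of such states over acoustic feature vectors, but the decoding idea is the same.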

The video goes on to talk about how modern technology slices and dices language down to phonemes, the building blocks of language, but how that impressive feat of engineering only represents part of creating real discourse between man and machine. The next big step will be language learning, which Google engineers and scientists seem convinced will come in the form of neural nets, essentially mimicking the way our brain interprets language.

Although speech recognition has plenty of room for improvement, the next time you curse Siri for misquoting your conversation, take a second to marvel at the interaction that’s actually taking place. [9to5Google]

Source Article from http://feeds.gawker.com/~r/gizmodo/full/~3/9Ej-Gd9-SYc/this-short-film-explains-our-fascination-with-talking-t-1648157524

Watch Jony Ive and Elon Musk talk design and sci-fi transportation

Apple's Jony Ive speaking with Vanity Fair

Tired of hearing little more than soundbites from tech luminaries such as Apple’s Jony Ive and Tesla’s Elon Musk? Today’s your lucky day. Vanity Fair has posted its full video interviews with both Ive and Musk, giving you insight into how the two executives work. Not surprisingly, Ive’s chat focuses on his design philosophies and processes, including what he thinks of Xiaomi’s eerily familiar-looking products (spoiler: he doesn’t see them as “flattery”). Musk, meanwhile, both drops hints about Tesla’s semi-automated Model S P85D and discusses the motivations behind the science-fiction-inspired transport from SpaceX and Tesla, including why it’s important for humanity to go to Mars. The two discussions are lengthy at about half an hour each, but they’re definitely worthwhile if you want to see what makes key industry figures tick.

[Image credit: Kimberly White/Getty Images for Vanity Fair]

Source Article from http://www.engadget.com/2014/10/19/jony-ive-elon-musk-video-interviews/?ncid=rss_truncated

Playing Doom (Poorly) on a VoCore

Last May brought the unastonishing news that companies were taking the Systems on Chip found in $20 wireless routers and making dev boards out of them. The first of these is the VoCore, an Indiegogo campaign for a 360MHz CPU with 8MB of Flash and 32MB of RAM packaged in a one-square-inch PCB for the Internet of Things. Now that the Indiegogo rewards are heading out to workbenches the world over, it was only a matter of time before someone got Doom to run on one of them.

After fixing some design flaws in the first run of VoCores, [Pyrofer] did the usual things you would do with a tiny system running Linux – webcams for streaming video, USB sound cards to play internet radio, and the normal stuff OpenWrt does.

His curiosity satiated, [Pyrofer] turned to more esoteric builds. With a color LCD from Sparkfun, he got an NES emulator running. This is all through hardware SPI, mind you. Simple 2D graphics are cool enough, but the standard graphical test for all low-powered computers is, of course, Doom.

The game runs, but just barely. Still, [Pyrofer] is happy with the VoCore, and with a little more work on the SPI interface and a proper framebuffer for his tiny system, he might have a neat portable Doom machine on his hands.

Source Article from http://feedproxy.google.com/~r/hackaday/LgoM/~3/f9DuFsvl74g/

NPR Tells The Other Story of Women In Tech

We’re all used to hearing about women, or rather the lack of them, in the technology industry. But as NPR’s Planet Money points out, things weren’t always that way: back at the dawn of the IT age, women were major players in the computer science field. The question is: what happened in 1984?

According to NPR’s excellent graph of women’s share of college majors by field, the percentage of women in computer science was rising in lockstep with other majors right up until 1984, when the numbers flattened off and then nosedived, back down to the below-20-percent levels seen today.

NPR’s answer isn’t necessarily one you’d expect: it’s that starting in 1984, high-schoolers who wanted to go into computer science programs had to have had access to a computer at home — something that was far more likely for the boys of the day, thanks to the way that home computers were marketed at the time, as a pursuit exclusively for boys. As such, when computers entered the home, it was as a male-oriented gaming machine — something that hasn’t changed since.

The result is that, as a completely unintended consequence of how a new toy was marketed, an entire generation of girls was subtly disadvantaged when it came to applying for a computer science program, and, moreover, less inclined to apply in the first place. The full radio story is only 17 minutes long and is definitely worth a listen, if nothing else as an excellent example of the law of unintended consequences. [NPR]

Source Article from http://feeds.gawker.com/~r/gizmodo/full/~3/KkO4Ag35mUE/npr-tells-the-other-story-of-women-in-tech-1648057516

Disney rendered its new animated film on a 55,000-core supercomputer

Disney’s upcoming animated film Big Hero 6, about a boy and his soft robot (and a gang of super-powered friends), is perhaps the largest big-budget mash-up you’ll ever see. Every aspect of the film’s production represents a virtual collision of worlds. The story, something co-director Don Hall calls “one of the more obscure titles in the Marvel universe,” has been completely re-imagined for parent company Disney. Then, there’s the city of San Fransokyo it’s set in — an obvious marriage of two of the most tech-centric cities in the world. And, of course, there’s the real-world technology that not only takes center stage as the basis for characters in the film, but also powered the onscreen visuals. It’s undoubtedly a herculean effort from Walt Disney Animation Studios, and one that’s likely to go unnoticed by audiences.


“We’ve said it many, many times. We made the movie on a beta renderer,” says Hank Driskill, technical supervisor for Big Hero 6. “It was very much in progress.” Driskill is referring to Hyperion, the software Disney created from the ground up to handle the film’s impressive lighting. It’s just one of about three dozen tools the studio used to bring the robotics-friendly world of San Fransokyo to life. Some, like the program Tonic originally created for Rapunzel’s hair in Tangled, are merely improved versions of software built for previous efforts, or “shows” as Disney calls them. Hyperion, however, represents the studio’s greatest and riskiest commitment to R&D in animation technology thus far. And its feasibility wasn’t always a sure thing, something Disney’s Chief Technology Officer Andy Hendrickson underscores when he says, “It’s the analog to building a car while you’re driving it.”

For that reason, Hendrickson instructed his team to embark on two development paths for Big Hero 6: the experimental Hyperion and a Plan B that hinged on a commodity renderer. It took a team of about 10 people over two years to build Hyperion, during which time Driskill says resources were being spread thin: “We were running with a backup plan until around June of last year … [and] we realized we were spending too much energy keeping the backup plan viable. It was detracting in manpower … from pursuing the new idea as fully as we could. So we just said, ‘We’re gonna go for it.’ And we turned off the backup plan.”

Hyperion, as the global-illumination simulator is known, isn’t the kind of technology that would excite the average moviegoer. As Hendrickson explains, it handles incredibly complex calculations to account for how “light gets from its source to the camera as it’s bouncing and picking up colors and illuminating other things.” This software allowed animators to eschew the incredibly time-consuming manual effort to animate single-bounce, indirect lighting in favor of 10 to 20 bounces simulated by the software. It’s responsible for environmental effects — stuff most audiences might take for granted, like when they see Baymax, the soft, vinyl robot featured in the film, illuminated from behind. That seemingly mundane lighting trick is no small feat; it required the use of a 55,000-core supercomputer spread across four geographic locations.
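
Why 10 to 20 bounces is enough can be seen with a back-of-the-envelope model: if each bounce reflects some fraction of the incoming light (the surface’s albedo), the indirect contributions form a geometric series that converges quickly. The albedo value below is an arbitrary illustration, not a figure from Disney:

```python
# Back-of-the-envelope: with per-bounce albedo a, the total light
# gathered after n bounces is 1 + a + a^2 + ... + a^n, a geometric
# series with infinite-bounce limit 1 / (1 - a).
def gathered_light(albedo: float, bounces: int) -> float:
    return sum(albedo ** k for k in range(bounces + 1))

a = 0.5                   # illustrative albedo
limit = 1 / (1 - a)       # infinite-bounce limit of the series
for n in (1, 5, 10, 20):
    frac = gathered_light(a, n) / limit
    print(f"{n:2d} bounces capture {frac:.4%} of the limit")
```

Past a dozen bounces the remaining energy is negligible, which is why simulating 10 to 20 of them looks essentially like the physical ideal.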

Disney Animation CTO Andy Hendrickson demonstrates Hyperion’s real-world lighting simulation.

“This movie’s so complex that humans couldn’t actually handle the complexity. We have to come up with automated systems,” says Hendrickson. To manage that cluster and the 400,000-plus computations it processes per day (roughly about 1.1 million computational hours), his team created software called Coda, which treats the four render farms like a single supercomputer. If one or more of those thousands of jobs fails, Coda alerts the appropriate staffers via an iPhone app.
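
The quoted figures can be cross-checked with simple division; the numbers are the article’s, but the utilization framing here is our own:

```python
# Cross-checking the render-farm figures quoted in the article.
cores = 55_000
core_hours_per_day = 1_100_000
jobs_per_day = 400_000

hours_per_core = core_hours_per_day / cores
print(f"~{hours_per_core:.0f} busy hours per core per day "
      f"({hours_per_core / 24:.0%} utilization)")

avg_job_hours = core_hours_per_day / jobs_per_day
print(f"average job: ~{avg_job_hours:.2f} core-hours")
```

That works out to roughly 20 busy hours per core per day, which is why a tool like Coda, rather than a human, has to babysit the queue.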

To put the enormity of this computational effort into perspective, Hendrickson says that Hyperion “could render Tangled from scratch every 10 days.”

If that doesn’t drive the power of Disney’s proprietary renderer home, then consider this: San Fransokyo contains around 83,000 buildings, 260,000 trees, 215,000 streetlights and 100,000 vehicles (plus thousands of crowd extras generated by a tool called Denizen). What’s more, all of the detail you see in the city is actually based on assessor data for lots and street layouts from the real San Francisco. As Visual Effects Supervisor Kyle Odermatt explains, animating a city that lively and massive simply would not have been possible with previous technology. “You couldn’t zoom all the way out [for a] wide shot down to just a single street level the way we’re able to,” he says.

Beyond the supercomputer cluster and software tools devised to make the movie, Big Hero 6 leans heavily on cutting-edge technology for its visual majesty in one other way: its characters. Both Baymax, the aforementioned lovable robot sidekick, and the microbots, swarm-like mini-drones controlled by telepathy, are steeped in some very real scientific research. That decision to ground the world of Big Hero 6 in near-future technologies led Hall and co-director Chris Williams on research trips to MIT, Harvard and Carnegie Mellon in the US, and even to Tokyo University in Japan.

A soft robotic arm developed by researchers at Carnegie Mellon University.

“You know, we try to look at, like, five to 10 years down the road at what was coming … It seems counterintuitive because in animation you can do anything, but it still has to be grounded in a believable world,” says Hall.

Indeed, there’s even a moment where supergenius lead character Hiro Hamada uses a 3D printer in his garage to create an outfit for Baymax. In discussing the scene, Roy Conli, the film’s producer, credits the “maker movement that’s going on right now.” He adds, “These kids are makers. So it’s a little bit the celebration of the nerd.”

It was during a visit to Carnegie Mellon that Hall came across researcher Chris Atkeson, who’d been working in the field of inflatable soft robotics: robots intended for the health care industry. Hall says Atkeson pleaded with him to “make a movie where the robot is not the villain.” But Atkeson didn’t have to do much convincing; Hall’s vision for Baymax meshed nicely with his research. He’d wanted a robot audiences hadn’t seen on screen before. Hall continues, “The minute I saw this [research], I knew that we had our huggable robot. I knew that we had found Baymax.”

The team also drew inspiration for Baymax from existing compassionate-care tech out of Japan. “They’re a little ahead of the curve,” Hall says. “I mean, [health care robots] are actually in practice in some of the hospitals in Japan. They’re not vinyl; they’re not Baymax. They’re plastic robotics.”

The high-tech city of San Fransokyo represents a mash-up of eastern and western culture.

Robotics research out of Carnegie Mellon also provided the basis for the unwitting pawns of the film: the Lego-like, mind-controlled microbots. Of course, the version we see in the film is a much more fantastical approach to the simple, water-walking bots Hall’s team glimpsed during their visit. That, coupled with a heavy dose of inspiration from swarm-drone tech, led to the insect-like creepiness of the microbots in the final film.

By design, the electromagnetic microbots move as if part of a chain: Each individual “link” travels from front to back to propel the swarm forward in a circuit-board-like pattern. On average, the visual effects team says there are about 20 million microbots onscreen in a given shot, and that level of complexity is where Hyperion once again comes crucially into play. Originally, however, the team didn’t think its full vision of the microbots would even be possible to render.

“We thought the technology would never actually be able to handle it happening in all of the shots,” explains Head of Effects Michael Kaschalk. “And to do that from shot to shot, that takes artists’ work to just be able to create the [lighting] cheat. But as Hyperion developed, and we actually built the system, we found that it was handling all of this data just fine. So we actually built the real thing.”

Hiro scans Baymax to create 3D-printed armor.

Though tech innovation clearly plays an important role in development at Disney Animation Studios, it’s not the sole guiding force for each film and, for that matter, neither is the story. The studio’s process is entirely collaborative. “We are looking for input from everybody that works here for storytelling … there’s no doubt that those ideas can rise up from anywhere to become a big piece or small piece of the story,” says Odermatt. There’s no one single source of motivation other than a love of research and functional design — key concepts imparted by Chief Creative Officer John Lasseter.

In a way, Big Hero 6 is a love letter to technology. It’s a fantasy film that gives audiences a knowing wink toward the robot-assisted near-future, as if to say, “This is exactly where you’re headed. And it’s coming soon.” Big Hero 6 also represents a perfect storm for Disney: The subject matter (makers and robotics) and setting (hyper-tech San Fransokyo) dovetailed with the economic feasibility of cutting-edge computational hardware (that massive render farm) and the development of advanced animation techniques (Hyperion). It’s a film for, by and from lovers of technology.

That Big Hero 6 has a technological heart and soul is not lost on Hall. In fact, he’s keenly aware of this. “The movie does celebrate science and technology in a way that we haven’t really done before.”

[Image credit: Walt Disney Animation; Carnegie Mellon University (soft robotic arm)]

Source Article from http://www.engadget.com/2014/10/18/disney-big-hero-6/?ncid=rss_truncated

The Economics of Fuzz Testing with the Intel Edison

The Intel Edison is an incredibly small and cheap x86 computing platform, and with that comes the obvious applications for robotics and wearable computing. [mz] had another idea: what if the Edison could do work that is usually done by workstations? Would it make economic sense to buy a handful of Edisons over a single quad-core Xeon system?

[mz] thought the Edison would be an ideal platform for fuzz testing, or sending random, automated data at a program or system to figure out if they’ll misbehave in interesting ways. After figuring out where to solder power and ground wires to boot an Edison without a breakout board, [mz] got to work benchmarking his fuzz testing setup.
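
The core loop of a fuzzer like this is simple to sketch: throw random bytes at a target and watch for abnormal exits. Below is a minimal stdin fuzzer; the target command (`/bin/cat`) is just a harmless stand-in, since [mz]’s actual harness isn’t described in detail:

```python
import random
import subprocess

def fuzz_once(cmd, max_len=1024, seed=None):
    """Feed one random byte string to cmd's stdin; return (crashed, data)."""
    rng = random.Random(seed)
    data = bytes(rng.randrange(256) for _ in range(rng.randrange(1, max_len)))
    proc = subprocess.run(cmd, input=data, capture_output=True, timeout=5)
    # A negative return code means the process died from a signal
    # (e.g. SIGSEGV), which is exactly what a fuzzer is hunting for.
    return proc.returncode < 0, data

crashed, data = fuzz_once(["/bin/cat"], seed=0)
print("crash!" if crashed else f"survived {len(data)} random bytes")
```

Real fuzzing campaigns run millions of such iterations, mutate inputs that hit new code paths, and save any input that triggers a crash, which is why raw throughput per dollar matters so much here.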

Comparing the benchmarks of a fuzzing job running on the Edison and a few servers and workstations, calculations of cost-efficiency worked out well for this tiny x86 system on module. For parallelizable tasks, the Edison is about 8x less powerful than a reasonably modern server, but it’s also about 5-8x cheaper than a comparable desktop machine. Although renting a server is by far the more economic solution for getting a lot of computing power easily, there are a few use cases where a cluster of Edisons in your pocket would make sense.
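
Plugging the article’s own rough ratios into a throughput-per-dollar comparison shows why the verdict lands where it does:

```python
# Using the article's rough figures: an Edison is ~8x less powerful
# than a modern server, and ~5-8x cheaper than a comparable machine.
perf_ratio = 1 / 8          # Edison throughput relative to the big machine
for cost_factor in (5, 8):  # the "5-8x cheaper" range
    cost_ratio = 1 / cost_factor
    rel_ppd = perf_ratio / cost_ratio
    print(f"{cost_factor}x cheaper -> Edison gives {rel_ppd:.2f}x "
          "the throughput per dollar")
```

So the Edison delivers somewhere between 0.62x and 1.0x the throughput per dollar of the bigger machine, which squares with the conclusion that rented servers usually win, except where portability or power constraints tip the scales.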

Source Article from http://feedproxy.google.com/~r/hackaday/LgoM/~3/BeZIxiIIScE/

Having recalled its Force fitness tracker earlier this year, Fitbit has said that it’s not going to do the same for the Flex

Having recalled its Force fitness tracker earlier this year, Fitbit has said that it’s not going to do the same for the Flex — despite some user reports of the same skin rash problems as dogged the Force. Rather, future Flex units will ship with a warning that the product contains nickel, a common allergen. [New York Times]

Source Article from http://feeds.gawker.com/~r/gizmodo/full/~3/Ltga0rGlDXI/having-recalled-its-force-fitness-tracker-earlier-this-1647945545