Who makes the most reliable hard drives?


Backblaze is back again, this time with updated hard drive statistics and failure rates for all of 2016. Backblaze’s quarterly reports on HDD failure rates and statistics are the best data set we have for measuring drive reliability and performance, so let’s take a look at the full year and see who the winners and losers are.

Backblaze only includes hard drive models in its report if it has at least 45 drives of that type, and it currently has 72,100 hard drives in operation. The charts that follow are Backblaze's own, stepped through with additional commentary and information.

Backblaze has explained before that it can tolerate a relatively high failure rate before it starts avoiding drives altogether, but the company has been known to take that step (it stopped using a specific type of Seagate drive at one point due to unacceptably high failure rates). Current Seagate drives have been much better and the company’s 8TB drives are showing an excellent annualized failure rate.

Next, we’ve got something interesting — drive failure rates plotted against drive capacity.

The “stars” mark the average annualized failure rate for all of the hard drives for each year.

The giant peak in 3TB drive failures was driven by the Seagate ST3000DM001, with its 26.72% failure rate. Backblaze took the unusual step of yanking those drives after they proved unreliable. With those drives retired, the 3TB failure rate falls back to normal.

One interesting bit of information in this graph is that drive failure rates don't really shift much over time. The shifts we do see are as likely to be caused by Backblaze's ongoing rotation among manufacturers, as old drives are retired and new models become available, as by any underlying change in reliability. Higher-capacity drives aren't failing at statistically different rates than older, smaller drives, so buyers don't need to worry that bigger drives are more prone to failure.
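For readers who want to reproduce the headline numbers, Backblaze has described its annualized failure rate as failures per drive-year of service. The Python sketch below shows that calculation with made-up inputs; the function name and the figures are ours, not Backblaze's.

    # Minimal sketch of an annualized-failure-rate (AFR) calculation, assuming
    # Backblaze-style inputs: total drive-days in service and failures observed.
    # The numbers below are invented for illustration.

    def annualized_failure_rate(drive_days: int, failures: int) -> float:
        """Return AFR as a percentage: failures per drive-year of service."""
        drive_years = drive_days / 365.0
        return 100.0 * failures / drive_years

    # Example: 1,000 drives running all year with 20 failures -> 2.0% AFR.
    print(annualized_failure_rate(drive_days=1_000 * 365, failures=20))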

The usual grain of salt

As always, Backblaze's data sets should be taken as a representative sample of how drives perform in this specific workload. Backblaze's buying practices prioritize low-cost drives over any other type, and the company doesn't buy the enterprise drives that WD, Seagate, and other manufacturers position specifically for these kinds of deployments. Whether this has any impact on consumer drive failure rates isn't known: HDD manufacturers advertise their enterprise hardware as having gone through additional validation and being designed for high-vibration environments, but there are few studies on whether these claims translate into meaningfully better performance or reliability.

 

Backblaze’s operating environment has little in common with a consumer desktop or laptop, and may not cleanly match the failure rates we would see in these products. The company readily acknowledges these limitations, but continues to provide its data on the grounds that having some information about real-world failure rates and how long hard drives live for is better than having none at all. We agree. Readers often ask which hard drive brands are the most reliable, but this information is extremely difficult to come by. Most studies of real-world failure rates don’t name brands or manufacturers, which limits their real-world applicability.

 

[Source:- Extremetech]

Researchers from the UGR develop new software that adapts medical technology to see the interior of a sculpture


A student at the University of Granada (UGR) has designed software that adapts current medical technology to analyze the interior of sculptures. The tool makes it possible to see inside wood carvings without damaging them, and it has been designed for the restoration and conservation of sculptural heritage.

Francisco Javier Melero, professor of Languages and Computer Systems at the University of Granada and director of the project, says that the new software simplifies medical technology and adapts it to the needs of restorers working with wood carvings.

The software, called 3DCurator, is a specialized viewer that brings computed tomography (CT) to the field of restoration and conservation of sculptural heritage. It adapts medical CT scans to the needs of restoration and displays a 3-D image of the carving being studied.

Replacing traditional X-rays with this system allows restorers to examine the interior of a statue without the overlapping information produced by older techniques, revealing its internal structure, the age of the wood from which it was made, and any later additions.

“The software that carries out this task has been simplified in order to allow any restorer to easily use it. You can even customize some functions, and it allows the restorers to use the latest medical technology used to study pathologies and apply it to constructive techniques of wood sculptures,” says professor Melero.

 

This system, which can be downloaded for free from www.3dcurator.es, visualizes the hidden information inside a carving, verifies whether it contains metallic elements, identifies damage caused by xylophagous insects such as termites and the tunnels they bore, and detects plasters or polychrome paint added after the original finish.
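3DCurator's internals aren't described in detail here, but a common way to flag dense inclusions such as nails in CT data is simply to threshold voxel intensities. The Python sketch below illustrates that general idea on a synthetic volume; the values and the cutoff are hypothetical and are not taken from 3DCurator.

    # Illustrative only: flagging dense (metallic) inclusions in a CT volume by
    # thresholding voxel intensities. This is NOT 3DCurator's code; the volume
    # and the cutoff value below are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    volume = rng.normal(loc=400, scale=150, size=(64, 64, 64))  # wood-like densities
    volume[30:34, 30:34, 10:50] = 3000                          # simulate an iron nail

    METAL_THRESHOLD = 2000        # assumed cutoff separating metal from wood/plaster
    metal_mask = volume > METAL_THRESHOLD
    print("metallic voxels flagged:", int(metal_mask.sum()))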

The main developer of 3DCurator was Francisco Javier Bolívar, who stressed that the tool will mean a notable breakthrough in the field of conservation and restoration of cultural assets and the analysis of works of art by experts in Art History.

Professor Melero explains that this new tool has already been used to examine two sculptures owned by the University of Granada: a 16th-century statue of San Juan Evangelista and a 17th-century Immaculate Conception, both of which can be examined virtually at the Virtual Heritage Site of the Andalusian Universities (patrimonio3d.ugr.es/).

 

 

[Source:- Phys.org]

 

The Nintendo Switch will need its smartphone app for online matchmaking

The Nintendo Switch companion app is fast turning into an essential part of the console.

On top of the previously announced news that you'll need the app to enable voice chat on the console, Nintendo of America president Reggie Fils-Aime suggested in a recent interview that the app will be used for a lot more besides voice chat.

In fact, the app’s functionality actually goes as far as enabling matchmaking and allowing you to create lobbies, suggesting that your online options are going to be pretty slim without your smartphone.

Smart (phone) justifications

Fils-Aime justified the decision to rely on the app for voice chat by saying that most people already carry a headset that connects to their phone at all times.

As such, using the phone for voice chat makes sense, since it means you don't have to carry around an extra Switch-specific headset.

But while these justifications make a certain amount of sense for using the console while on the go, the same can’t be said for docked play, where people are used to having a dedicated headset and a console that can handle everything without needing accessories.

Fils-Aime’s use of the word ‘hotspot’ also suggests that Nintendo expects people to tether their console to their phone to get online while on the go, which might prove challenging for anyone with a limited amount of data.

It's beginning to feel as though, in its quest to make a hybrid console, Nintendo is turning the Switch into a device with limitations in both form factors.

We’ve contacted Nintendo to ask for clarification on what exactly the mobile app will enable, and what form of online play will be possible without the app.

 

[Source:- TechRadar]

 

 

Snapchat is now using the third-party ad targeting it once called ‘creepy’

Snapchat is now accessing its users’ offline purchase data to improve the targeting of its ads, despite its CEO having previously deemed this kind of advertising “creepy.”

Following in the footsteps of tech and social media giants such as Facebook, Twitter, and Google, Snap Inc. has partnered with a third-party offline data provider, Oracle Data Cloud, according to The Wall Street Journal.

This partnership will allow Snapchat advertisers to access data about what users buy offline in order to more accurately target ads.

Snapchat gets specific

Rather than seeing the generally less invasive, broad-appeal advertisements that used to run on Snapchat, you're now more likely to see ads that make you think "how did they know?", because you'll be assigned to a specific consumer segment such as "consumer tech purchaser."

This decision shows the company is taking its growth seriously, as it's a different approach from the one CEO Evan Spiegel laid out in June 2015. Back then, Spiegel stated his distaste for such personalized advertising, saying: "I got an ad this morning for something I was thinking about buying yesterday, and it's really annoying. We care about not being creepy. That's something that's really important to us."

Now, however, Snap Inc. has to do all it can to guarantee that its stock is worth buying when it goes public later this year. This kind of advertising is a good way to do so, because it should make Snapchat a more attractive option to advertisers: targeted adverts are likely to earn more per view.

Fortunately, if this kind of advertising doesn't sit well with you, whether because you consider it invasive or because you're just incredibly susceptible to it, Snapchat is giving its users the ability to opt out. The retargeted adverts have already started rolling out, so you can change the setting now.

To do so, simply go into the settings section within the Snapchat app, go to Manage Preferences, select Ad Preferences and switch off the Snap Audience Match function.

 

 

[Source:- TechRadar]

 

An app to crack the teen exercise code


Pokémon GO has motivated its players to walk 2.8 billion miles. Now, a new mobile game from UVM researchers aims to encourage teens to exercise with similar virtual rewards.

The game, called “Camp Conquer,” is the brainchild of co-principal investigators Lizzy Pope, assistant professor in the Department of Nutrition and Food Science, and Bernice Garnett, assistant professor of education in the College of Education and Social Services, both of the University of Vermont. The project is one of the first in the area of gamification and obesity, and will test launch with 100 Burlington High School students this month.

Here’s how it works: Real-world physical activity, tracked by a Fitbit, translates into immediate rewards in the game, a capture-the-flag-style water balloon battle with fun, summer camp flair. Every step a player takes in the real world improves their strength, speed, and accuracy in the game. “For every hundred steps, you also get currency in the game to buy items like a special water balloon launcher or new sneakers for your avatar,” says Pope.
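Camp Conquer's exact tuning isn't public, but the mapping Pope describes, steps into currency and stat boosts, is easy to picture. Here is a hedged Python sketch of that kind of conversion; the rates and stat formulas are invented for illustration and are not the game's actual values.

    # A sketch of the step-to-reward mapping described above. The conversion
    # rates and stat boosts are invented; they are not Camp Conquer's values.

    def rewards_from_steps(steps: int) -> dict:
        coins = steps // 100            # "for every hundred steps ... currency"
        stat_boost = steps / 10_000     # hypothetical: 10,000 steps = +1.0 to each stat
        return {
            "coins": coins,
            "strength": stat_boost,
            "speed": stat_boost,
            "accuracy": stat_boost,
        }

    print(rewards_from_steps(4_250))    # {'coins': 42, 'strength': 0.425, ...}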

Helping Schools Meet Mandates

In 2014, Vermont established a requirement for students to get 30 minutes of physical activity during the school day (in addition to P.E. classes), a mark Pope says schools are struggling to hit. And it’s not just Vermont; according to the CDC, only 27 percent of high school students nationwide hit recommended activity goals, and 34 percent of US teens are overweight or obese.

Camp Conquer is a promising solution. The idea struck after Pope and Garnett visited Burlington High School, where they saw students playing lots of games on school-provided Chromebook laptops. Pope and Garnett approached Kerry Swift in UVM’s Office of Technology Commercialization for help. “I thought, if we’re going to make a game, it’s going to be legit,” says Pope.

Where Public Meets Private

The team is working with GameTheory, a local design studio whose mission is to create games that drive change. Pope says forming these types of UVM/private business partnerships to create technology that can be commercialized is the whole point of UVMVentures Funds, which partially support this project.

A key result of this public/private partnership, and of the cross-departmental collaboration between Pope and Garnett, was a methodology shift. Pope says it’s less common for health behavior researchers to involve their target demographic in “intervention design.” But Garnett, who has experience in community-based participatory research, and GameTheory, which commonly utilizes customer research, helped shift this. “Putting the experience of Bernice and GameTheory together, we came up with student focus groups to determine when they’re active, why they’re not, and what types of games they like to play,” says Pope. She believes this student input has Camp Conquer poised for success. “It gave us a lot of good insight, and created game champions.”

What does success look like? Pope says in her eyes, “it’s all about exciting kids to move more.” But another important aspect is the eventual commercialization of the app. “It could be widely disseminated at a very low cost. You could imagine a whole school district adopting the app,” says Pope. She expects that if the January test shows promise, GameTheory will take the game forward into the marketplace, and continue to update and improve it. “There’s definitely potential,” says Pope.

[Source:- Phys.org]

The SIM-unlocked Alcatel IDOL 4S quietly goes on sale through the Microsoft Store


It looks like speculation that Alcatel's Idol 4S running Windows 10 Mobile would go carrier-unlocked (GSM) after its T-Mobile exclusivity ended was true. As spotted by MSPU, Microsoft has begun to make the rather powerful, and impressive, Windows 10 Mobile phone available for purchase in the US through its online store.

The asking price is still the same $470, which includes the VR goggle package and the 21MP rear camera.

Alcatel Idol 4S with Windows 10 Specs

CPU: Snapdragon 820, quad-core @ 2.15GHz
Display: 5.5-inch FHD AMOLED, Dragontrail 2.5D glass
Memory: 64GB ROM, 4GB RAM, microSD expansion
Camera: 21MP rear, 8MP front-facing
Battery: 3,000mAh, Quick Charge 3.0, 420 hours standby, 15 hours talk time
Continuum: Yes
VR: Yes
Windows Hello: Yes (fingerprint)
Audio: Dual speakers with Hi-Fi surround sound
Dimensions: 153.9 x 75.4 x 6.99 mm
Weight: 152g
HD Voice: Yes
VoLTE: Yes
Wi-Fi: 802.11 a/b/g/n/ac, Wi-Fi Calling 1.0
Bluetooth: BT 4.1 (A2DP, OPP, HFP, AVRCP, PBAP)

The rest of the specifications and color (‘Halo Gold’) are all the same as well. In fact, it’s likely the same device as our review unit, which was unlocked as well and worked brilliantly on AT&T with no issue.

Microsoft notes that the unlocked version should work on AT&T, T-Mobile, H20, Straight Talk, Cricket Wireless, MetroPCS, and select prepaid carriers.

 

[Source:- Windowscentral]

 

AI tools came out of the lab in 2016


You shouldn’t anthropomorphize computers: They don’t like it.

That joke is at least as old as Deep Blue’s 1997 victory over then world chess champion Garry Kasparov, but even with the great strides made in the field of artificial intelligence over that time, we’re still not much closer to having to worry about computers’ feelings.

Computers can analyze the sentiments we express in social media, and project expressions on the face of robots to make us believe they are happy or angry, but no one seriously believes, yet, that they “have” feelings, that they can experience them.

Other areas of AI, on the other hand, have seen some impressive advances in both hardware and software in just the last 12 months.

Deep Blue was a world-class chess opponent — and also one that didn’t gloat when it won, or go off in a huff if it lost.

Until this year, though, computers were no match for a human at another board game, Go. That all changed in March when AlphaGo, developed by Google subsidiary DeepMind, beat Lee Sedol, then the world’s strongest Go player, 4-1 in a five-match tournament.

AlphaGo’s secret weapon was a technique called reinforcement learning, where a program figures out for itself which actions bring it closer to its goal, and reinforces those behaviors, without the need to be taught by a person which steps are correct. That meant that it could play repeatedly against itself and gradually learn which strategies fared better.

Reinforcement learning techniques have been around for decades, too, but it’s only recently that computers have had sufficient processing power (to test each possible path in turn) and memory (to remember which steps led to the goal) to play a high-level game of Go at a competitive speed.
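AlphaGo's actual training pipeline pairs deep neural networks with large-scale self-play, but the core reinforcement-learning loop described above can be shown in a few lines. The toy Python sketch below uses tabular Q-learning on a one-dimensional corridor; everything in it is illustrative and has nothing to do with DeepMind's implementation.

    # Toy tabular Q-learning: reinforce actions that move the agent toward a goal.
    # This is a teaching sketch, not anything resembling AlphaGo's training code.
    import random

    N_STATES, GOAL = 6, 5                 # states 0..5; reaching state 5 ends an episode
    ACTIONS = (-1, +1)                    # step left or right
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    alpha, gamma, epsilon = 0.5, 0.9, 0.1

    for _ in range(500):                  # episodes of self-directed trial and error
        s = 0
        while s != GOAL:
            if random.random() < epsilon:                     # explore occasionally
                a = random.choice(ACTIONS)
            else:                                             # otherwise exploit what was learned
                a = max(ACTIONS, key=lambda act: Q[(s, act)])
            s_next = min(max(s + a, 0), GOAL)
            reward = 1.0 if s_next == GOAL else 0.0
            best_next = max(Q[(s_next, act)] for act in ACTIONS)
            Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
            s = s_next

    # Learned policy: move right (+1) from every state before the goal.
    print({s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)})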

Better performing hardware has moved AI forward in other ways too.

In May, Google revealed its TPU (Tensor Processing Unit), a hardware accelerator for its TensorFlow deep learning framework. These ASICs (application-specific integrated circuits) can execute the types of calculations used in machine learning much faster and with less power than even GPUs, and Google has installed several thousand of them in its server racks in the slots previously reserved for hard drives.

The TPU, it turns out, was one of the things that made AlphaGo so fast, but Google has also used the chip to accelerate mapping and navigation functions in Street View and to improve search results with a new AI tool called RankBrain.

Google is keeping its TPU to itself for now, but others are releasing hardware tuned for AI applications. Microsoft, for example, has equipped some of its Azure servers with FPGAs (field-programmable gate arrays) to accelerate certain machine learning functions, while IBM is targeting similar applications with a range of PowerAI servers that use custom hardware to link its Power CPUs with Nvidia GPUs.

For businesses that want to deploy cutting-edge AI technologies without developing everything from scratch themselves, easy access to high-performance hardware is a start, but not enough. Cloud operators recognize that, and are also offering AI software as a service. Amazon Web Services and Microsoft’s Azure have both added machine learning APIs, while IBM is building a business around cloud access to its Watson AI.

The fact that these hardware and software tools are cloud-based will help AI systems in other ways too.

Being able to store and process enormous volumes of data is only useful to an AI that has access to vast quantities of data from which to learn: data such as that collected and delivered by cloud services, whether it's information about the weather, mail-order deliveries, requests for rides, or people's tweets.

Access to all that raw data, rather than the minute subset, processed and labelled by human trainers, that was available to previous generations of AIs, is one of the biggest factors transforming AI research today, according to a Stanford University study of the next 100 years in AI.

And while having computers watch everything we do, online and off, in order to learn how to work with us might seem creepy, it’s really only in our minds. The computers don’t feel anything. Yet.

 

[Source:- JW]

 

Transforming, self-learning software could help save the planet


Artificially intelligent computer software that can learn, adapt and rebuild itself in real-time could help combat climate change.

Researchers at Lancaster University’s Data Science Institute have developed a software system that can for the first time rapidly self-assemble into the most efficient form without needing humans to tell it what to do.

The system — called REx — is being developed with vast energy-hungry data centres in mind. By being able to rapidly adjust to optimally deal with a huge multitude of tasks, servers controlled by REx would need to do less processing, therefore consuming less energy.

REx works using ‘micro-variation’ — where a large library of building blocks of software components (such as memory caches, and different forms of search and sort algorithms) can be selected and assembled automatically in response to the task at hand.

“Everything is learned by the live system, assembling the required components and continually assessing their effectiveness in the situations to which the system is subjected,” said Dr Barry Porter, lecturer at Lancaster University’s School of Computing and Communications. “Each component is sufficiently small that it is easy to create natural behavioural variation. By autonomously assembling systems from these micro-variations we then see REx create software designs that are automatically formed to deal with their task.

“As we use connected devices on a more frequent basis, and as we move into the era of the Internet of Things, the volume of data that needs to be processed and distributed is rapidly growing. This is causing a significant demand for energy through millions of servers at data centres. An automated system like REx, able to find the best performance in any conditions, could offer a way to significantly reduce this energy demand,” Dr Porter added.

In addition, as modern software systems are increasingly complex — consisting of millions of lines of code — they need to be maintained by large teams of software developers at significant cost. It is broadly acknowledged that this level of complexity and management is unsustainable. As well as saving energy in data centres, self-assembling software models could also have significant advantages by improving our ability to develop and maintain increasingly complex software systems for a wide range of domains, including operating systems and Internet infrastructure.

REx is built using three complementary layers. At the base level a novel component-based programming language called Dana enables the system to find, select and rapidly adapt the building blocks of software. A perception, assembly and learning framework (PAL) then configures and perceives the behaviour of the selected components, and an online learning process learns the best software compositions in real-time by taking advantage of statistical learning methods known as ‘linear bandit models’.
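The paper describes a linear-bandit learner choosing among compositions; reproducing that exactly is beyond a news piece, but the flavour of online composition selection can be sketched with a much simpler epsilon-greedy bandit. Everything below, the component names, latencies, and parameters, is hypothetical.

    # Hedged sketch of bandit-style composition selection in the spirit of REx.
    # REx itself uses linear bandit models over the Dana/PAL stack; this toy uses
    # plain epsilon-greedy over made-up compositions with simulated latencies.
    import random

    variants = {                          # hypothetical compositions -> mean latency (ms)
        "lru_cache+quicksort": 120,
        "no_cache+quicksort":  180,
        "lru_cache+mergesort": 140,
    }
    counts = {v: 0 for v in variants}
    means = {v: 0.0 for v in variants}
    epsilon = 0.1

    for _ in range(2000):                 # live requests arriving over time
        if random.random() < epsilon or 0 in counts.values():
            chosen = random.choice(list(variants))            # keep exploring
        else:
            chosen = min(means, key=means.get)                # lower latency is better
        latency = random.gauss(variants[chosen], 15)          # observe live behaviour
        counts[chosen] += 1
        means[chosen] += (latency - means[chosen]) / counts[chosen]  # running mean

    print("selected composition:", min(means, key=means.get))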

The work is presented in the paper 'REx: A Development Platform and Online Learning Approach for Runtime Emergent Software Systems' at OSDI '16, the 12th USENIX Symposium on Operating Systems Design and Implementation. The research has been partially supported by the Engineering and Physical Sciences Research Council (EPSRC) and by a PhD scholarship from Brazil.

The next steps of this research will look at the automated creation of new software components for use by these systems and will also strive to increase automation even further to make software systems an active part of their own development teams, providing live feedback and suggestions to human programmers.

[Source:- SD]

Snowflake now offers data warehousing to the masses


Snowflake, the cloud-based data warehouse solution co-founded by Microsoft alumnus Bob Muglia, is lowering storage prices and adding a self-service option, meaning prospective customers can open an account with nothing more than a credit card.

These changes also raise an intriguing question: How long can a service like Snowflake expect to reside on Amazon, which itself offers services that are more or less in direct competition — and where the raw cost of storage undercuts Snowflake’s own pricing for same?

Open to the public

The self-service option, called Snowflake On Demand, is a change from Snowflake’s original sales model. Rather than calling a sales representative to set up an account, Snowflake users can now provision services themselves with no more effort than would be needed to spin up an AWS EC2 instance.

In a phone interview, Muglia discussed how the reason for only just now transitioning to this model was more technical than anything else. Before self-service could be offered, Snowflake had to put protections into place to ensure that both the service itself and its customers could be protected from everything from malice (denial-of-service attacks) to incompetence (honest customers submitting massively malformed queries).

“We wanted to make sure we had appropriately protected the system,” Muglia said, “before we opened it up to anyone, anywhere.”

This effort was further complicated by Snowflake’s relative lack of hard usage limits, which Muglia characterized as being one of its major standout features. “There is no limit to the number of tables you can create,” Muglia said, but he further pointed out that Snowflake has to strike a balance between what it can offer any one customer and protecting the integrity of the service as a whole.

“We get some crazy SQL queries coming in our direction,” Muglia said, “and regardless of what comes in, we need to continue to perform appropriately for that customer as well as other customers. We see SQL queries that are a megabyte in size — the query statements [themselves] are a megabyte in size.” (Many such queries are poorly formed, auto-generated SQL, Muglia claimed.)

Fewer costs, more competition

The other major change is a reduction in storage pricing for the service — $30/TB/month for capacity storage, $50/TB/month for on-demand storage, and uncompressed storage at $10/TB/month.

It’s enough of a reduction in price that Snowflake will be unable to rely on storage costs as a revenue source, since those prices barely pay for the use of Amazon’s services as a storage provider. But Muglia is confident Snowflake is profitable enough overall that such a move won’t impact the company’s bottom line.

“We did the data modeling on this,” said Muglia, “and our margins were always lower on storage than on compute running queries.”

According to the studies Snowflake performed, “when customers put more data into Snowflake, they run more queries…. In almost every scenario you can imagine, they were very much revenue-positive and gross-margin neutral, because people run more queries.”
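As a rough, hedged illustration of what those tiers mean in practice, the arithmetic below plugs the quoted capacity and on-demand rates into a one-line cost function; real Snowflake bills also depend on compression and compute usage, which this sketch ignores.

    # Quick arithmetic on the storage tiers quoted above ($30/TB/month capacity,
    # $50/TB/month on demand). Real bills also reflect compression ratios and
    # compute charges, which this sketch deliberately ignores.

    def monthly_storage_cost(terabytes: float, on_demand: bool = False) -> float:
        rate = 50.0 if on_demand else 30.0   # USD per TB per month
        return terabytes * rate

    print(monthly_storage_cost(100))                    # 100 TB capacity: 3000.0
    print(monthly_storage_cost(100, on_demand=True))    # 100 TB on demand: 5000.0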

The long-term implications for Snowflake continuing to reside on Amazon aren’t clear yet, especially since Amazon might well be able to undercut Snowflake by directly offering competitive services.

Muglia, though, is confident that Snowflake’s offering is singular enough to stave off competition for a good long time, and is ready to change things up if need be. “We always look into the possibility of moving to other cloud infrastructures,” Muglia said, “although we don’t have plans to do it right now.”

He also noted that Snowflake competes with Amazon's Redshift right now, but "we have a very different shape of product relative to Redshift…. Snowflake is storing multiple petabytes of data and is able to run hundreds of simultaneous concurrent queries. Redshift can't do that; no other product can do that. It's that differentiation that allows us to effectively compete with Amazon, and for that matter Google and Microsoft and Oracle and Teradata."

 

 

[Source:- IW]

Azure brings SQL Server Analysis Services to the cloud


SQL Server Analysis Services, one of the key features of Microsoft’s relational database enterprise offering, is going to the cloud. The company announced Tuesday that it’s launching the public beta of Azure Analysis Services, which gives users cloud-based access to semantic data modeling tools.

The news is part of a host of announcements the company is making at the Professional Association for SQL Server Summit in Seattle this week. On top of the new cloud service, Microsoft also released new tools for migrating to the latest version of SQL Server and an expanded free trial for Azure SQL Data Warehouse. On the hardware side, the company revealed new reference architecture for using SQL Server 2016 with active data sets of up to 145TB.

The actions are all part of Microsoft’s continued investment in the company’s relational database product at a time when it’s trying to get customers to move to its cloud.

Azure Analysis Services is designed to help companies get the benefits of cloud processing for semantic data modeling, while still being able to glean insights from data that’s stored either on-premises or in the public cloud. It’s compatible with databases like SQL Server, Azure SQL Database, Azure SQL Data Warehouse, Oracle and Teradata. Customers that already use SQL Server Analysis Services in their private data centers can take the models from that deployment and move them to Azure, too.

One of the key benefits to using Azure Analysis Services is that it’s a fully managed service. Microsoft deals with the work of figuring out the compute resources underpinning the functionality, and users can just focus on the data.

Like its on-premises predecessor, Azure Analysis Services integrates with Microsoft’s Power BI data visualization tools, providing additional modeling capabilities that go beyond what that service can offer. Azure AS can also connect to other business intelligence software, like Tableau.

Microsoft is also making it easier to migrate from an older version of its database software to SQL Server 2016. To help companies evaluate the difference between their old version of SQL Server and the latest release, Microsoft has launched the Database Experimentation Assistant.

Customers can use the assistant to run experiments across different versions of the software, so they can see what if any benefits they’ll get out of the upgrade process while also helping to reduce risk. The Data Migration Assistant, which is supposed to help move workloads, is also being upgraded.

For companies that have large amounts of data they want to store in a cloud database, Microsoft is offering an expanded free trial of Azure SQL Data Warehouse. Users can sign up starting on Tuesday, and get a free month of use. Those customers who want to give it a shot will have to move quickly, though: Microsoft is only taking trial sign-ups until December 31.

Microsoft Corporate Vice President Joseph Sirosh said in an interview that the change to the Azure SQL Data Warehouse trial was necessary because setting up the system to work with actual data warehouse workloads would blow through the typical Azure free trial. Giving people additional capacity to work with should let them have more of an opportunity to test the service before committing to a large deployment.

All of this SQL news comes a little more than a month before AWS Re:Invent, Amazon’s big cloud conference in Las Vegas. It’s likely that we’ll see Amazon unveil some new database products at that event, continuing the ongoing cycle of competition among database vendors in the cloud.

 

 

[Source:- IW]