Tim Sweeney is positively steam-ed about Microsoft’s Windows Cloud operating system


Yesterday, we reported on Windows Cloud — a new version of Microsoft’s Windows 10 that’s supposedly in the works. Windows Cloud would be limited to applications that are available through the Windows Store and is widely believed to be a play for the education market, where Chromebooks are currently popular.

Tim Sweeney, the founder of Epic and lead developer on the Unreal Engine, has long been a harsh critic of Microsoft and its Windows Store. He wasted no time launching a blistering tirade against this new variant of the operating system, before Microsoft has even had a chance to launch the thing.

With all respect to Tim, I think he’s wrong on this for several reasons. First, the idea that the Windows Store is going to crush Steam is simply farcical. There is no way for Microsoft to disallow Steam or other applications from running in mainstream Windows without completely breaking Win32 compatibility in its own operating system. Smartphone manufacturers were able to introduce app stores and walled gardens early on, before any legacy software base existed; Windows has no such luxury. Fortune 500 companies, gamers, enthusiasts, and computer users in general would never accept an OS that refused to run Win32 applications.

The second reason the Windows Store is never going to crush Steam is that the Windows Store is, generally speaking, a wasteland where software goes to die. The mainstream games that have debuted on that platform have generally been poor deals compared with what’s available on other platforms (like Steam). There’s little sign Microsoft is going to change this anytime soon, and until it does, Steam’s near-monopoly on PC game distribution is safe.

Third, if Microsoft is positioning this as a play against Chrome OS, Windows Cloud isn’t going to debut on high-end systems that are gaming-capable in the first place. This is a play aimed at low-end ARM or x86 machines with minimal graphics and CPU performance. In that space, a locked-down system is a more secure system. That’s a feature, not a bug, if your goal is to build systems that won’t need constant IT attention to deal with trojans, malware, and bugs.

Like Sweeney, I value the openness and capability of the PC ecosystem — but I also recognize that there are environments and situations where that openness is a risk with substantial downside and little benefit. Specialized educational systems for low-end markets are not a beachhead aimed at destroying Steam. They’re a rear-guard action aimed at protecting Microsoft’s educational market share from an encroaching Google.



[Source:- Extremetech]

Software system labels coral reef images in record time

Computer scientists at the University of California San Diego have released a new version of a software system that processes images from the world’s coral reefs 10 to 100 times faster than processing the data by hand.

This is possible because the new version of the system, dubbed CoralNet Beta, includes deep learning technology, which uses vast networks of artificial neurons to learn to interpret image content and to process data.

CoralNet Beta cuts the time needed to go through a typical 1,200-image diver survey of the ocean floor from 10 weeks to just one week—with the same accuracy. Coral ecologists and government organizations, such as the National Oceanic and Atmospheric Administration, also use CoralNet to automatically process images from autonomous underwater vehicles. The system allows researchers to label different types of coral and whether they’ve been bleached, different types of invertebrates, different types of algae—and more. In all, over 2,200 labels are available on the site.

“This will allow researchers to better understand the changes and degradation happening in coral reefs,” said David Kriegman, a computer science professor at the Jacobs School of Engineering at UC San Diego and one of the project’s advisers.

The Beta version of the system runs on a deep neural network with more than 147 million neural connections. “We expect users to see a very significant improvement in automated annotation performance compared to the previous version, allowing more images to be annotated quicker—meaning more time for field deployment and higher-level data analysis,” said Oscar Beijbom, a UC San Diego Ph.D. alumnus and the project’s manager and founder of CoralNet.

He created CoralNet Alpha in 2012 to help label images gathered by oceanographers around the world. Since then, more than 500 users, from research groups to nonprofits to government organizations, have uploaded more than 350,000 survey images to the system. Researchers used CoralNet Alpha to label more than five million data points across these images with a tool, designed by UC San Diego alumnus Stephen Chen, the project’s lead developer, that labels random points within an image.
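CoralNet’s internals aren’t described in detail here, but the random-point survey protocol it supports is straightforward to sketch. The following toy Python script shows the general idea—sample random points in an image, classify a patch around each one—with hypothetical names and a stand-in classifier where the real system would run a deep network:

```python
import random

# Hypothetical label subset -- CoralNet's real taxonomy has over 2,200 labels.
LABELS = ["hard_coral", "bleached_coral", "algae", "sand"]

def sample_points(width, height, n_points=200, seed=0):
    """Pick n random (x, y) annotation points inside an image, as a
    random-point diver-survey protocol does."""
    rng = random.Random(seed)
    return [(rng.randrange(width), rng.randrange(height)) for _ in range(n_points)]

def classify_point(image, x, y):
    """Placeholder for the deep-network patch classifier: a real system
    would crop a patch around (x, y) and run it through a CNN."""
    patch = image[max(0, y - 1):y + 2]  # toy 'patch': nearby pixel rows
    return LABELS[sum(map(sum, patch)) % len(LABELS)]

def annotate(image, width, height):
    """Map each sampled point to a label, one survey image at a time."""
    return {(x, y): classify_point(image, x, y)
            for x, y in sample_points(width, height)}

# Toy 4x4 'image' of integer pixel intensities.
image = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
annotations = annotate(image, width=4, height=4)
print(len(annotations))  # number of distinct annotated points
```

The point of automating exactly this step is clear from the numbers above: classifying a patch per point is the repetitive work that takes a human analyst weeks per survey.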

“Over time, news of the site spread by word of mouth, and suddenly it was used all over the world,” said Beijbom.

Other updates in the Beta version include an improved user interface, better web security, and scalable hosting on Amazon Web Services.

[Source:- Phys.org]

Microsoft releases the latest update to Analytics Platform System


Microsoft is pleased to announce that the appliance update Analytics Platform System (APS) 2016 has been released to manufacturing and is now generally available. APS is Microsoft’s scale-out, massively parallel processing (MPP) fully integrated system for data warehouse-specific workloads.

This appliance update builds on the SQL Server 2016 release as a foundation to bring you many value-added features. APS 2016 offers additional language coverage to support migrations from SQL Server and other platforms. It also features improved security for hybrid scenarios and the latest security and bug fixes through new firmware and driver updates.

SQL Server 2016

APS 2016 runs on the latest SQL Server 2016 release and now uses the default database compatibility level of 130, which supports improved query performance. SQL Server 2016 allows APS to offer features such as secondary index support for clustered columnstore index (CCI) tables and PolyBase Kerberos support.


APS 2016 supports broader T-SQL compatibility, including support for wider rows and a large number of rows, VARCHAR(MAX), NVARCHAR(MAX) and VARBINARY(MAX). For greater analysis flexibility, APS supports full window frame syntax for ROWS or RANGE and additional windowing functions like FIRST_VALUE, LAST_VALUE, CUME_DIST and PERCENT_RANK. Additional functions like NEWID() and RAND() work with new data type support for UNIQUEIDENTIFIER and NUMERIC. For the full set of supported T-SQL, please visit the online documentation.
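To make the windowing additions concrete, here is a plain-Python illustration of what two of the functions named above compute; this mimics the semantics, not the APS engine itself. FIRST_VALUE is shown with a frame of ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW, and CUME_DIST over a whole ordered partition:

```python
def first_value(values):
    """FIRST_VALUE(...) OVER (ORDER BY ... ROWS UNBOUNDED PRECEDING):
    for each row, the first value of the ordered frame seen so far."""
    return [values[0] for _ in values] if values else []

def cume_dist(values):
    """CUME_DIST(): fraction of rows with a value <= the current row's
    value, over an ordered partition."""
    n = len(values)
    return [sum(1 for v in values if v <= x) / n for x in values]

sales = [10, 20, 20, 40]          # one ordered partition of row values
print(first_value(sales))          # [10, 10, 10, 10]
print(cume_dist(sales))            # [0.25, 0.75, 0.75, 1.0]
```

Note how ties share a CUME_DIST value (both 20s map to 0.75), which is exactly the behavior the SQL function specifies.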

PolyBase/Hadoop enhancements

PolyBase now supports the latest Hortonworks HDP 2.4 and HDP 2.5. This appliance update provides enhanced security through Kerberos support via database-scoped credentials and credential support with Azure Storage Blobs for added security across big data analysis.

Install and upgrade enhancements

Hardware architecture updates bring the latest-generation processor support (Broadwell), DDR4 DIMMs, and improved DIMM throughput; these will ship with hardware purchased from HPE, Dell or Quanta. This update offers customers an enhanced upgrade and deployment experience by pre-packaging certain Windows Server updates, hotfixes, and installer components that previously required an on-site download.

APS 2016 also adds Fully Qualified Domain Name (FQDN) support, making it possible to set up a domain trust to the appliance. It also ships with the latest firmware/driver updates containing security updates and fixes.

Flexibility of choice with Microsoft’s data warehouse portfolio

The latest APS update is an addition to Microsoft’s existing data warehouse portfolio, covering a range of technology and deployment options that help customers get to insights faster. Customers exploring data warehouse products can also consider SQL Server with Fast Track for Data Warehouse or Azure SQL Data Warehouse, a cloud-based, fully managed service.


[Source:- Technet]

System predicts 85 percent of cyber-attacks using input from human experts


Today’s security systems usually fall into one of two categories: human or machine. So-called “analyst-driven solutions” rely on rules created by living experts and therefore miss any attacks that don’t match the rules. Meanwhile, today’s machine-learning approaches rely on “anomaly detection,” which tends to trigger false positives that both create distrust of the system and end up having to be investigated by humans, anyway.

But what if there were a solution that could merge those two worlds? What would it look like?

In a new paper, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the machine-learning startup PatternEx demonstrate an artificial intelligence platform called AI2 that predicts cyber-attacks significantly better than existing systems by continuously incorporating input from human experts. (The name comes from merging artificial intelligence with what the researchers call “analyst intuition.”)

The team showed that AI2 can detect 85 percent of attacks, which is roughly three times better than previous benchmarks, while also reducing the number of false positives by a factor of 5. The system was tested on 3.6 billion pieces of data known as “log lines,” which were generated by millions of users over a period of three months.

To predict attacks, AI2 combs through data and detects suspicious activity by clustering the data into meaningful patterns using unsupervised machine learning. It then presents this activity to human analysts, who confirm which events are actual attacks, and incorporates that feedback into its models for the next set of data.

“You can think about the system as a virtual analyst,” says CSAIL research scientist Kalyan Veeramachaneni, who developed AI2 with Ignacio Arnaldo, a chief data scientist at PatternEx and a former CSAIL postdoc. “It continuously generates new models that it can refine in as little as a few hours, meaning it can improve its detection rates significantly and rapidly.”

Veeramachaneni presented a paper about the system at last week’s IEEE International Conference on Big Data Security in New York City.

Creating cybersecurity systems that merge human- and computer-based approaches is tricky, partly because of the challenge of manually labeling cybersecurity data for the algorithms.

For example, let’s say you want to develop a computer-vision algorithm that can identify objects with high accuracy. Labeling data for that is simple: Just enlist a few human volunteers to label photos as either “objects” or “non-objects,” and feed that data into the algorithm.

But for a cybersecurity task, the average person on a crowdsourcing site like Amazon Mechanical Turk simply doesn’t have the skillset to apply labels like “DDOS” or “exfiltration attacks,” says Veeramachaneni. “You need security experts.”

That opens up another problem: Experts are busy, and they can’t spend all day reviewing reams of data that have been flagged as suspicious. Companies have been known to give up on platforms that are too much work, so an effective machine-learning system has to be able to improve itself without overwhelming its human overlords.

AI2’s secret weapon is that it fuses together three different unsupervised-learning methods, and then shows the top events to analysts for them to label. It then builds a supervised model that it can constantly refine through what the team calls a “continuous active learning system.”

Specifically, on day one of its training, AI2 picks the 200 most abnormal events and gives them to the expert. As it improves over time, it identifies more and more of the events as actual attacks, meaning that in a matter of days the analyst may be looking at only 30 or 40 events a day.
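The loop described above—rank events with unsupervised outlier scores, show the top handful to an analyst, refit a supervised model on the confirmed labels—can be sketched in a few lines of Python. This is a toy simulation with made-up data and a one-parameter “model” (a learned threshold), not the AI2 system itself:

```python
import random

rng = random.Random(42)

# Synthetic "log lines": mostly benign values, a few large attack values.
events = [rng.gauss(0, 1) for _ in range(980)] + [rng.gauss(8, 1) for _ in range(20)]

def analyst_labels(batch):
    """Stand-in for the human expert: in this toy world, an event is a
    real attack when its value exceeds 5."""
    return {e: e > 5 for e in batch}

labeled = {}
model_threshold = None  # the "supervised model" is just a learned cutoff

for day in range(3):
    # Unsupervised step: rank unlabeled events by an outlier score (|value|).
    unlabeled = [e for e in events if e not in labeled]
    budget = 200 if day == 0 else 40   # 200 events on day one, fewer later
    top_k = sorted(unlabeled, key=abs, reverse=True)[:budget]
    # Analyst step: the expert confirms which flagged events are attacks.
    labeled.update(analyst_labels(top_k))
    # Supervised step: refit the cutoff from the confirmed attacks so far.
    attacks = [e for e, is_attack in labeled.items() if is_attack]
    if attacks:
        model_threshold = min(attacks)

detected = sum(1 for e in events if model_threshold is not None and e >= model_threshold)
print(detected)  # events the refined model now flags as attacks
```

The real system fuses three unsupervised detectors and trains a genuine classifier rather than a cutoff, but the feedback structure—outlier ranking, analyst confirmation, supervised refinement—is the same.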

“This paper brings together the strengths of analyst intuition and machine learning, and ultimately drives down both false positives and false negatives,” says Nitesh Chawla, the Frank M. Freimann Professor of Computer Science at the University of Notre Dame. “This research has the potential to become a line of defense against attacks such as fraud, service abuse and account takeover, which are major challenges faced by consumer-facing systems.”

The team says that AI2 can scale to billions of log lines per day, transforming the pieces of data on a minute-by-minute basis into different “features”, or discrete types of behavior that are eventually deemed “normal” or “abnormal.”

“The more attacks the system detects, the more analyst feedback it receives, which, in turn, improves the accuracy of future predictions,” Veeramachaneni says. “That human-machine interaction creates a beautiful, cascading effect.”
[Source:- Phys.org]

Monitor System Stats, CPU Temp, Fan Speed in Mac Notification Center


Many Mac users like to keep a watchful eye on their system stats, including processor utilization, memory usage, disk activity, network usage, CPU temperature, fan speed, and perhaps battery stats. The Activity Monitor Dock icon offers one way to do this, but only on a limited basis, so if you’d rather see all kinds of system resource activity in a single control panel you may appreciate these two free Notification Center widgets for Mac OS X.

The first is called Monit, and once added to Notification Center it offers a means of quickly seeing an overview of CPU activity, memory usage, disk activity, battery, and network activity. You can then click on any of the little activity icons to get further information about each.


The second utility is called Fanny, and it keeps an eye on the fan speed and CPU temperature of the Mac, also within Notification Center. This tool is likely most useful for Mac laptop users, but many desktop users also like to know what their fan is doing and what temperature the CPU is running at.


Both of these utilities install like any other Notification Center widget on the Mac: after you have opened the individual app, add its widget by opening Notification Center, clicking “Edit,” then adding the widgets and arranging them within the Notification Center panel as you see fit. You can also uninstall them at any time through the same Edit section of Notification Center.


These widgets are purely for monitoring general statistics and resource usage; there is no actionable PID list. If you’re expecting to take action on a CPU hog by killing the app, it can’t be done here, and you’d need to rely on one of the various methods of force quitting a Mac app.

Do keep in mind that system activity monitoring uses a small amount of CPU itself, so if you’re really pinched for processor resources you may not want to run these types of widgets at all. And if you’re not the type to install third-party tools or utilities, the top command-line tool and the Activity Monitor app can offer similar functionality without any add-ons in Mac OS X, which is great if you find Notification Center alerts annoying and the whole accompanying widget thing a nuisance.
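If you’re curious what a lightweight monitor like these widgets does under the hood, the basic pattern is just periodic polling of OS counters. Here is a minimal Python sketch of one polling pass using only the standard library (CPU load averages and disk usage; Unix-only, and not how Monit or Fanny are actually implemented—fan speed and CPU temperature require platform-specific APIs):

```python
import os
import shutil
import time

def snapshot(path="/"):
    """One polling pass of the kind a lightweight monitor widget performs:
    CPU load averages (Unix only) plus disk usage for a volume."""
    load1, load5, load15 = os.getloadavg()
    disk = shutil.disk_usage(path)
    return {
        "load_1m": load1,
        "load_5m": load5,
        "load_15m": load15,
        "disk_used_pct": 100 * disk.used / disk.total,
    }

if __name__ == "__main__":
    for _ in range(2):  # a real widget would poll on a timer indefinitely
        stats = snapshot()
        print(f"load {stats['load_1m']:.2f}, disk {stats['disk_used_pct']:.0f}% used")
        time.sleep(1)
```

The polling itself is why such tools consume a little CPU, as noted above: each refresh interval costs a few system calls and some redrawing.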


[Source:- OSxdaily]