Digital Offers: Never forget a password again for $15

What do Facebook, email, and your online banking information all have in common? They are all secured behind a login username and password. Assuming you use a different password for each service, that ends up being a lot of passwords to memorize, and chances are you will forget at least one of them at some point.

Few things in the world are more frustrating than forgetting your password; it usually leaves you pounding on your keyboard or throwing your phone across the room. Windows Central Offers can help you never forget a password ever again.


True Key by Intel Security is award-winning password-management software that will improve your security and prevent you from forgetting your login information.

True Key uses biometrics to make you the password! You can use your fingerprints, eyes, or even your face to access your favorite websites and accounts, and right now Windows Central Offers can offer you a one-year license to True Key for only $15.99.

Just look at some of the other features you can get with True Key by Intel Security:

  • Access from any device; True Key syncs to your phone, tablet, and computer.
  • Verify access to the app with your face, fingerprint, or via devices you trust for total security.
  • Store and manage up to 10,000 passwords securely in the True Key app, which is accessible only by you via devices you’ve approved.
  • Sync passwords automatically to your phones, tablets, and computers for easy access on any approved device.

Don’t ever lose your password again. True Key by Intel Security will protect your security and give you peace of mind, and right now we can offer it to you for only $15.99.




[Source:- Windowscentral]

New MacBook 2016 release date, price, specs rumours UK: No 12-inch MacBook announced alongside new MacBook Pro 2016 – now expected March 2017


Is a 13in MacBook going to launch in 2016? When will the 2017 12-inch MacBook be released? What can I expect from the next 12-inch MacBook in terms of tech specs? How much will the 2017 12-inch MacBook cost? Will the 12-inch MacBook replace the MacBook Air range?

Apple only released the 2016 variant of the 12in MacBook back in April 2016, but we’re already looking to the future and what we could expect from next year’s model, the 2017 12in MacBook. Here, we sift through the latest rumours surrounding the 2017 MacBook and offer our own predictions based on previous Apple events and knowledge of the company.

Those of you who want to find out more about the current 12in MacBook, released in April 2016, can take a look at our 12in MacBook review, which covers everything from pricing to performance and design, sprinkled with our personal opinions of Apple’s latest MacBook.

Apple decided not to update the MacBook or MacBook Air lines during its October 2016 event, and cut the 11in MacBook Air completely. This suggests that the MacBook is set to replace the Air line. We suspect a refresh to the MacBook line in March 2017.

Read more: Best MacBook buying guide | Best Mac buying guide 2017 | Best cheap MacBook deals UK

New MacBook 2016 release date rumours: When is the new 12in MacBook coming out?

So, when are we likely to see the next-generation 12in MacBook? Considering that Apple only recently released the 2016 variant of the laptop, we assumed we wouldn’t be seeing another upgrade until next year, 2017.

Apple has trademarked three new MacBook model numbers, according to a Russian trademark filing. The three new model numbers, A1706, A1707 and A1708, were tipped to be a 13in and 15in MacBook Pro, and a MacBook with a 12in screen. This is all according to the reliable KGI analyst Ming-Chi Kuo. We aren’t sure whether these model numbers correspond to the new MacBook Pro models, but we will update this article once we confirm them.

We had originally expected to see the 2016 variant of the MacBook announced during 2016’s spring Apple event, which was one year on from Apple’s unveiling of the very first 12-inch MacBook models. But instead, Apple revealed the iPhone SE, a 9.7in iPad Pro and new Apple Watch straps, with no mention of an updated MacBook. A few weeks later Apple surprised us by updating the MacBook without any bells and whistles or another event.

Apple is a company of habit – new iOS software is showcased every June (along with macOS, tvOS and watchOS) which is then released alongside the latest generation iPhone months later, in September. It has been that way for more than a few years now, with the only exception being with the launch of the iPhone 4. Following Apple’s MacBook habits to date, it suggests to us that we’ll be seeing the 2017 MacBook sat on our laps between March and May 2017.

New MacBook 2016 rumours: Will the 12in MacBook replace the MacBook Air?

In October 2016 Apple showcased four new MacBooks, none of them an Air model. It seems Apple wants us to believe that it hasn’t officially killed off the Air, but it all looks like an indirect confirmation of the 12-inch MacBook replacing the Air in Apple’s affection and ongoing product portfolio.

Apple’s MacBook Air design is now eight years old, and it’s quite possible that the MacBook is lining up to replace it in the near future. When the MacBook Air first launched, its biggest selling point was its thin and light design, hence the name; but the MacBook now outshines it in those areas. To be honest, barring a major and revolutionary redesign it seems unlikely that the MacBook Air has much of a future ahead of it. Plus, for those looking for ultimate portability there’s the new iPad Pro with a 12.9in screen.


The last time there was a Mac laptop that had more advanced specs than a more expensive model was the old MacBooks (white and black, and then eventually aluminium). Those were eventually discontinued and the price of the MacBook Air reduced. It seems likely that the same will happen with the new MacBook models replacing the MacBook Air models at a lower price than they are now, at least eventually – especially considering the MacBook Air’s less-than-exciting 2016 update.

According to trusted Apple analyst Ming-Chi Kuo, the 12in MacBook is now Apple’s best-selling computer, closely followed by the 13in MacBook Pro, which adds further fuel to the rumour that it’ll soon replace the MacBook Air thanks to its popularity.

KGI Securities analyst Ming-Chi Kuo also claims that Apple is planning to introduce a 13in MacBook to sit alongside the 12in model in the third quarter of 2016. Kuo is a goldmine for inside Apple information and has been called “the most accurate Apple analyst in the world”, providing accurate rumours regarding the iPhone 6s months in advance of its release, along with a flurry of predictions about the upcoming iPhone 7, which many assume to be true. Current rumours suggest an October hardware event where Apple will announce the 2016 MacBook Pro – will the company announce a larger MacBook alongside it?

However, while Kuo is usually accurate, we’re not too confident about this one. The rumour hasn’t been backed up by any leaks or other sources, and it seems like a pretty strange move to release a new MacBook only 1in larger than the current model, so it’s best to take this with a pinch of salt. If true, we think it signals the end of the MacBook Air range. See more MacBook Air rumours here.

New MacBook 2016 release date rumours UK: UK price

While we’re still a way away from the official announcement of the 2017 MacBook, we can already speculate about the pricing as Apple rarely changes the price of its range from generation to generation, unless it’s a fairly hefty upgrade.

With that being said, the 2016 MacBook Pro will set you back £1,449 for the basic variant and £1,949 for a more powerful variant, and prices have all gone up since Brexit too (the basic Air is £100 more, and the basic MacBook £200 more expensive!)

New MacBook 2016 release date rumours UK: Design and features

Looking at the change in design from the 2015 MacBook to the 2016 MacBook, it suggests that we won’t be seeing huge physical changes. In fact, the only design change between the original MacBook and the 2016 MacBook was the addition of a new colour option, Rose Gold, alongside the existing Gold, Silver and Space Grey options.

Aside from that, the design hasn’t changed for the MacBook. It’s incredibly thin at 13.1mm, and it weighs just 0.9kg, making it 24 percent thinner than the MacBook Air, and we don’t expect that to change dramatically in future.

Will the 2016 MacBook have a Force Touch keyboard?

Update 14 October: According to 9to5Mac, Apple is in talks with Sonder, a Foxconn-backed startup that uses E Ink technology to display its keyboard keys (see a video here). This allows keys to be customised and even symbols to be added in ways that would not be possible on a regular keyboard. It’s rumoured that Apple will use this technology in its next MacBook.

Back in autumn 2015, it emerged that Apple had filed a patent that appeared to show its design for a Force Touch-capable keyboard. Like the 2015 MacBook Pro, the 2015 MacBook has a Force Touch trackpad: a glass plate that doesn’t actually move, but delivers haptic pulses that feel like clicks. As on the iPhone 6s, you can press harder for a deeper click to access menus and options within certain apps. The MacBook also has keys unlike any other Mac’s, with very little travel, in order to keep the chassis ultra-thin.

The newly discovered patent shows what seems to be a whole keyboard and trackpad area fit to house this technology.

As this shows, the whole keyboard and trackpad, plus areas to the left and right of the pad, could theoretically be customised to the user’s tastes and, for the first time, not have a physical keyboard. However, we have seen Apple file patents in the past that are to bookmark ideas for the future.

It’d be amazing if this technology were included in the new MacBook next year, but we feel this is one for the coming years. It would potentially allow you to have several language keyboards saved and switch between them on the adaptable display. We can but dream.

Imagine typing on a surface that feels like a keyboard but is actually electric feedback telling your brain you’re pressing keys. If this is Force Touch’s future, we are excited.

Will the MacBook feature an Apple Pencil-compatible trackpad?

It’s not the only new addition to the MacBook either, if the latest patent approval is anything to go by. According to a patent filed by Apple which was recently approved, an upcoming Mac could boast compatibility with the Apple Pencil – although the Apple Pencil depicted in the patent is far more advanced than the one on sale at the moment. The Pencil in question features a number of sensors that could detect movement, orientation and depth and, according to the patent, could be used with a Mac as an ‘air mouse’ or possibly even a joystick for gaming.

The patent reads: “Inertial sensor input may be gathered when operating the stylus in one or more inertial sensor input modes such as an air mouse mode, a rotational controller mode, a joystick mode, and/or other inertial sensor input modes.”

It doesn’t end there, either – apparently an upcoming Mac trackpad will feature Apple Pencil support, allowing users to use and draw directly onto the trackpad with the precision of the iPad Pro. While the patent doesn’t mention whether the trackpad will be built into a MacBook or offered as a standalone Mac trackpad, we imagine that if Apple plans on utilising the patent, it’ll do so with its newest line of laptops – the MacBook.

Will Apple discontinue Thunderbolt?

One question that has arisen is whether the introduction of USB-C spells the end of Thunderbolt. We don’t think that Apple will drop Thunderbolt from its Pro Mac line up any time soon, but the standard may well disappear from the consumer level Macs eventually.

The reason we think it will remain on the MacBook Pro, Mac Pro and the iMac is Apple’s efforts to convince the industry to adopt it since its introduction in 2011. However, Apple also promoted FireWire to the industry and eventually removed that from its Macs.

New MacBook 2016 release date rumours UK: Tech specs

What can we expect from the 2017 MacBook in terms of tech specs? While rumours are scarce at this early stage, there is one interesting rumour that, if true, could herald a new generation of Force Touch-enabled keyboards for Apple’s laptop line.


The next-generation MacBook is likely to feature next-generation Intel processors, as well as graphics and RAM upgrades. Intel has started shipping its Kaby Lake processors, the generation of chips after Skylake, which adds support for Thunderbolt 3, USB 3.1 and DisplayPort 1.2.

But there’s another, less predictable, possibility. The Dutch-language site Techtastic has spotted a reference in the kernel of macOS Sierra to “ARM HURRICANE” being supported.

This isn’t a chip family that anyone has heard of, but it’s probably an Apple custom ARM chip: the A7 (in the iPhone 5s) was codenamed Cyclone, the A8 Typhoon and the A9 Twister. Apple might be about to put ARM chips in its new MacBooks.

Will the 2016 MacBook have LTE connectivity?

It seems that sharing your iPhone’s cellular connection with your MacBook wasn’t enough for Apple, if the latest patent approval is anything to go by. The patent, as described by the US Patent and Trademark Office, covers embedding LTE hardware in the 2017 MacBook, which would make it the first cellular-enabled Mac in Apple’s range, past or present.



[Source:- Macworld]


Why SQL Server 2005 end of life is good news for DBAs

The end of support for a product as wide-reaching as SQL Server can be a stressful time for the database administrators whose job it is to perform upgrades. However, two database experts see SQL Server 2005 end of life on April 12 as a blessing in disguise.

Bala Narasimhan, vice president of products at PernixData, and David Klee, founder and chief architect of Heraflux Technologies, said SQL Server 2005 end of life presents the opportunity DBAs need to take stock of their databases and make changes based on what newer versions of SQL Server have to offer.

SearchSQLServer spoke to Narasimhan and Klee about the best way for DBAs to take advantage of the opportunity that the end of support creates.

This is the first part of a two-part article. Click here for the second part.

How can we turn SQL Server 2005 end of life into an opportunity for DBAs?

“The end of life gives you a chance to look back at all of the innovations on the database side and on the infrastructure side as well.”

— Bala Narasimhan, vice president of products, PernixData

David Klee: I’ve been a DBA. I’ve been a system administrator. I’ve been an IT manager and an architect, and a lot of these different components overlap. My biggest take on it, from the role of the DBA, is that their number one job is to make sure that the data is there when you need it. Secondly, it’s about performance. The upgrade process is, in my eyes, wonderful, because the new versions of SQL Server 2012 and 2014, soon to be 2016, give you a lot more options for enterprise level availability. [They simplify] things. [They give] you better uptime. [They give] you better resiliency to faults. These are features that are just included with [them].

What this is doing is giving people a good opportunity to get the stragglers out of their environment. I’m out in the field a lot. I do see a lot of 2005 machines out here. It’s one of those things where the management mindset is: “If it’s not broke, don’t fix it.” But with end of life, the end of support is pretty significant.




Bala Narasimhan: I’m similar to David in terms of background, except that I did R&D at Oracle and other database companies. Since 2005, there has been a lot of innovation that has happened at a lot of database companies on the database side itself, but also on the infrastructure side holding these databases. I think it’s an opportunity to leverage all of that innovation as well. The end of life gives you a chance to look back at all of the innovations on the database side and on the infrastructure side as well. Sometimes, those innovations are complementary and sometimes they’re not. It gives you an opportunity to evaluate those and see what’s right for you in 2016.

In [SQL Server] 2014, there are features such as the columnstore and in-memory computing and all of that. … It may be the case that you can leverage similar functionality without having to upgrade to 2014, because there are other innovations happening in the industry. This may be another example of where you can step back and [ask yourself], “Should I upgrade to 2014 to get there? Or should I upgrade to 2012 because I don’t need it? Or is there another way to get the same capability?”

We’re both advocating for the right tool for the job.

Klee: Exactly. I don’t think that there is a specific answer to that. I think it depends on what that particular DBA wants and what that particular business is trying to achieve. There are multiple ways to achieve that and this is giving you an opportunity to evaluate that.

What are your suggestions for how DBAs can best take advantage of this upgrade?

Narasimhan: This is a time to take a step back. I would recommend having a conversation that includes the DBA; the storage admin; and, if they’re virtualized, the virtualization admin as well and try to understand what all three are trying to achieve because, at the end of the day, you need to run the database on some kind of infrastructure. In 2005, it needn’t have been virtualized, but, in today’s world, it will most probably be virtualized. So, bring them all to the table and try to understand what they need to do from a database perspective and an infrastructure perspective.

Once you’ve done that, there are other conversations to have, such as: “Do we want to run an application rewrite?” For instance, if you’re going to upgrade from 2005 to 2014 because you want to leverage the in-memory capabilities of SQL Server, then you need to revisit your database schema. You need to potentially rewrite your application. There are also cardinality estimation changes that can force a rewrite. Do you want to incur those costs? Sometimes the answer may be yes and sometimes no. If not, you aren’t required to go to 2014; you can go to 2012.

Similarly, it’s a chance to say this application has evolved over time. The optimizer has changed in SQL Server. Therefore the I/O capabilities have changed. Maybe we should talk to the storage admin and the virtualization admin and figure out what kind of infrastructure we’ll need to support this application successfully post-upgrade.

I will, therefore, open up the conversation a little bit and bring other stakeholders to the table before deciding which way to go.

Klee: My take on it is pretty much aligned with that. It’s, essentially, look at the architectural choices that went into that legacy deployment — high availability, performance, virtualization or no virtualization. Revisit today and see if the technology has changed, or you can simplify some of those choices or even replace them with features that weren’t even around back in the day. Availability Groups, virtualization, even public cloud deployments, any of the in-memory technologies, they were just not around back in the 2005 days, and now they’re just extremely powerful and extremely useful.



[Source:- searchsqlserver]

Facebook for Android quietly adds support for uploading HD video


The new HD video settings option was first reported by Android Police, which adds that the feature has apparently rolled out to most, but not quite all, of Facebook’s Android users. The report also says other new video features are appearing for some users, but they are not yet available for everyone. They include picture-in-picture video, the ability to download clips to watch offline later, and specific resolution upload options (from 72p all the way up to 360p).

Finally, up and down arrows are showing up for some Facebook users in the app’s notifications view. This will allow users to scroll through their notifications in either direction, instead of always jumping back to the top.

Again, not all of these features are available yet for all Android Facebook users, but they appear to be in testing for a wide release sometime in the very near future. Do you plan to use the new HD video upload option for your clips?



[Source:- Androidauthority]

China didn’t steal your job—I did


The most discussed issue in the last election was the plight of the so-called white working class. The story goes that hardworking people had their jobs shipped to Mexico thanks to NAFTA. A second idea is that immigrants have stolen working-class jobs. The kicker is to blame the nation of China.

These ideas attempt to explain why the Rust Belt is idle, but they’re all wrong. Neither the Mexicans nor the Chinese stole those jobs. I did.

I didn’t do it alone, of course. You and the other members of the technology industry who came before us did the bulk of the work. And guess what? If factories come back to the United States as a result of new policy, they will be run by robots.

The boom in the use of less expensive labor overseas was fueled by cheap shipping costs and a simple labor-versus-capital decision. The cost of investing in new equipment in the United States is higher than employing people overseas to produce an item. In some industries, investing in capital is simply riskier. Think about fashion or the latest toy or trinket: If you set up a manufacturing line to make it and it’s only popular for a season or a year, then you’ve risked a lot for a relatively small margin.

On the flip side, this is also why you’ve seen pharmaceutical plants remain in the United States. Thanks to long-term patent protections, it isn’t as risky to automate a factory here. In fact, between liability and the protection of trade secrets, it’s probably less risky than using cheap labor overseas and shipping product. But make no mistake, these aren’t blue-collar jobs going to high school graduates. These factories are highly automated—and monitored by white-collar workers.

If tariffs were increased on goods produced using cheap labor overseas, then of course some factories would move here. Even in those cases, very little of the work would then be done by high school graduates. Gone are the jobs done by Eminem in “8 Mile,” where someone yells “up!” and “down!” while another person stamps sheet metal with a heavy press. Robots can do that easily.

The equation isn’t much different for undocumented immigrants. Farms have invested heavily in capital equipment over the years—with the “last mile” handled by low-paid guest or undocumented workers. If those workers are ejected from the United States, you can bet agribusiness will invest in automation to replace the manual labor.

Capital tends to win in the end. Why? Technology—that is, “we”—tend to make investing in tech cheaper or more productive than labor eventually. Whether we’re designing robots to replace factory workers or developing machine learning to make administrative assistants redundant, we help justify technology purchases rather than hiring messy, expensive, unpredictable humans.

I strongly believe this is better for us all in the end, but the economic and social costs in the short-to-medium term are high. Relentless automation skews the distribution of wealth, undercutting relative economic power of the middle and working class versus the richest among us—and it ultimately hurts overall economic growth because you reduce the number of people capable of buying whatever goods the economy produces.

The solution to this problem is not as simple as “drill baby drill” or exiting NAFTA or slapping 35 percent tariffs on China. We need to take a holistic look at economic policy, education reform, and welfare spending—if not out of the goodness of our hearts, then for the long-term economic well-being of us all.



[Source:- Javaworld]

Amzer Hybrid Warrior Case for Lumia 550


Product Description

Deal of the Day: Sat 10th Dec 2016. Only while stocks last!

Military inspired, the outer polycarbonate layer features a rugged grenade molded design for your Lumia 550, delivering a surplus of protection. The inner layer combines a glossy and matte TPU for an eye-catching effect. Together the layers provide a durable, shock absorbing design withstanding even the worst drops and bumps. Collapsible stand lets you watch video or video chat at just the right angle. Fighting for phone protection, the Warrior case combines dual layers and a media stand for unparalleled power!

Military inspired.
Molded grenade design.
Dual layer technology.
Outer rubberized polycarbonate layer.
Interior TPU layer.
Matte and glossy TPU deliver an eye-catching look.
Collapsible stand perfect for video viewing or video chat.
Precise cutouts deliver access to all ports and controls.




[Source:- Windowscentral]


SQL Server 2016 features to make DBAs’ lives easier


SQL Server 2016 is about to launch with a long list of shiny new built-in features, along with much-needed improvements to important but humdrum capabilities database administrators rely on.

The upcoming release, slated for June 1, marks Microsoft’s initial go at a cloud-first version of SQL Server. It also happens to be one of the biggest releases in its history, with something for everyone, said Andrew Snodgrass, a research vice president at Directions on Microsoft, an independent analysis firm in Kirkland, Wash.

Some of the most notable SQL Server 2016 features include performance tuning, real-time operational analytics, visualization on mobile devices, and new hybrid support that allows admins to run their databases on premises and on public cloud services. Microsoft also invested in less sexy, but important, SQL Server 2016 features that hadn’t been improved in some time.

SSRS and SSIS finally get some love

Indeed, SQL Server 2016 is an exciting release for reporting and ETL practitioners, according to Tim Mitchell, principal at Tyleris Data Solutions, a data management services provider in Dallas.

SQL Server Reporting Services (SSRS), long suffering from release after release of few remarkable changes, received a significant makeover, he said. The classic Report Manager interface has given way to a brand new portal that looks and acts like a modern Web app — and it’s brandable, he noted.

The new KPI functionality makes building dashboards much easier, and the mobile reporting tools Microsoft added from the 2015 Datazen acquisition have made SSRS relevant for companies that support reporting for mobile users, according to Mitchell.

“Performance tuning with the new Query Store is one of those ‘about time’ solutions.”

— Andrew Snodgrass, research vice president, Directions on Microsoft

The changes in SQL Server Integration Services (SSIS) are more subtle, but significant. When the SSIS catalog was introduced in 2012, it brought many changes but one significant limitation: SSIS packages could no longer be deployed individually; instead, the entire project had to be deployed at once, said Mitchell, who is also a data platform MVP. 

“To their credit, Microsoft heard the roar of negative feedback and have changed this in 2016, once again allowing package-by-package deployment,” he said.

For those boxed in by the limitations of SSIS catalog logging, a new feature that supports custom logging levels brings freedom. Also, for those who were previously forced to install multiple versions of SQL Server Data Tools to support the various versions of SSIS, the new SQL Server Data Tool designer allows for targeting of a specific SQL Server Integration Services version when developing SSIS projects, Mitchell said.

Performance tuning, In-Memory OLTP and PolyBase

Perhaps the most useful SQL Server 2016 feature for database administrators involves performance tuning, which allows DBAs to monitor and record the full history of query execution plans to diagnose issues and optimize plans. It will be invaluable for upgrades and patching to see where changes have impacted performance, Directions on Microsoft’s Snodgrass said.

“Performance tuning with the new Query Store is one of those ‘about time’ solutions,” he added.
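As a rough illustration of the feature being described, a minimal T-SQL sketch of enabling Query Store and reading back recorded plan history follows. The database and row-count choices here are placeholders, not from the article:

```sql
-- Enable Query Store on a database (SQL Server 2016 and later);
-- 'SalesDb' is a hypothetical database name
ALTER DATABASE SalesDb SET QUERY_STORE = ON;

-- Inspect the recorded history: slowest queries with their captured plans
SELECT TOP 10
    qt.query_sql_text,
    rs.avg_duration,          -- average duration in microseconds
    p.query_plan              -- the execution plan as XML text
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;
```

Because the history persists across restarts, the same views can be compared before and after an upgrade or patch to see where plans regressed, which is the scenario Snodgrass describes.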

Other notable improvements in SQL Server 2016 include PolyBase integration, while the In-Memory OLTP and columnstore index performance features are finally mature enough for most companies to deploy, according to Snodgrass.

“The supported feature set, as compared to on-disk tables, was not on parity and it made it difficult to migrate to In-Memory tables without a great deal of effort,” he said.

In addition, Microsoft raised the size limit on memory-optimized tables to 2 TB, and those memory-optimized tables can be edited. Another important SQL Server 2016 feature is the ability to combine In-Memory OLTP and columnstore indexes on a single table.

“It’s not for everyone, but there are cases where it would be great to have real-time statistics and trends available from live, transactional data,” Snodgrass said. “Right now the process is time-delayed, since it usually requires grabbing transactions at a point in time and performing analysis somewhere other than on the transactional table.”

However, Snodgrass cautioned, DBAs shouldn’t try this without the proper infrastructure. “You’d better have beefy equipment and failover configured before trying this,” he said.
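To make the combination concrete, here is a minimal T-SQL sketch of a memory-optimized table carrying a clustered columnstore index, so analytical queries can run against live transactional rows. The table, column, and index names are invented for illustration, and the database needs a memory-optimized filegroup configured first:

```sql
-- A memory-optimized OLTP table that also carries a columnstore index
-- for real-time analytics on the same rows (SQL Server 2016 syntax)
CREATE TABLE dbo.OrderEvents
(
    OrderId    BIGINT        NOT NULL PRIMARY KEY NONCLUSTERED,
    CustomerId INT           NOT NULL,
    Amount     DECIMAL(10,2) NOT NULL,
    CreatedAt  DATETIME2     NOT NULL,
    INDEX ccix_OrderEvents CLUSTERED COLUMNSTORE
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

Aggregations over `dbo.OrderEvents` can then use the columnstore while inserts and updates hit the in-memory row store, which is what removes the point-in-time extract step Snodgrass mentions.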

PolyBase, which provides the ability to access unstructured data in Hadoop, has been in specialized versions of SQL Server since 2012. It will be included in SQL Server 2016 Enterprise edition. That means organizations that didn’t want to spend the money on big equipment can now use existing SQL Server installations to pull unstructured data, Snodgrass said.

“Of course, that doesn’t immediately solve the problem of deploying Hadoop, but it is good for the SQL guys,” he added.

JSON, live queries and analytics

JSON support is an important feature because it allows users to read and write JSON-based documents. This provides a controlled gateway for sharing organizational data with more mobile platforms. Companies have struggled to write database apps for mobile devices, because the data storage options weren’t compatible with on-premises data platforms, Snodgrass said.

“This provides a much easier method for transporting that data between mobile/Web solutions and relational database applications,” he said.
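A brief sketch of the transport path being described, using SQL Server 2016’s `FOR JSON` and `OPENJSON` constructs; the table and field names are hypothetical:

```sql
-- Serialize rows to a JSON array for a mobile/web client
SELECT OrderId, CustomerId, Amount
FROM dbo.Orders
FOR JSON PATH;

-- Shred an incoming JSON document from a client back into rows
DECLARE @doc NVARCHAR(MAX) =
    N'[{"OrderId":1,"CustomerId":42,"Amount":19.99}]';

SELECT OrderId, CustomerId, Amount
FROM OPENJSON(@doc)
WITH (
    OrderId    INT           '$.OrderId',
    CustomerId INT           '$.CustomerId',
    Amount     DECIMAL(10,2) '$.Amount'
);
```

The `WITH` clause maps JSON properties onto typed columns, which is the “controlled gateway” aspect: the relational side decides the schema, the mobile side just ships JSON.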

Other SQL Server 2016 features users are excited about are Query Store, Live Query Statistics and Live Query Plans (in Management Studio), according to Gareth Swanepoel, a senior data platform and BI consultant at Pragmatic Works Inc., a SQL Server software and training provider in Middleburg, Fla.

“These [Query features] represent a major improvement to performance tuning on a system,” Swanepoel said. “DBAs will have access to vastly enhanced metrics.”

In addition, SQL Server Management Studio’s release schedule has been separated from the main SQL Server releases, and it will be updated more frequently than before.

Perhaps least impressive of the new SQL Server 2016 features, according to Snodgrass, is SQL Server R Services, which supports advanced analytics with the R programming language.

“The ability to incorporate R scripts in stored procedures is interesting, but the audience is very limited and other tools out there do a good job of this,” he said. “It’s important for the long term, but I suspect adoption will be slow in the beginning.”
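For readers curious what “R scripts in stored procedures” looks like in practice, this is a minimal sketch of the `sp_execute_external_script` call R Services provides (the instance must have external scripts enabled, and the query and column names here are invented):

```sql
-- Run an R script inside SQL Server and return its result as a rowset
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(doubled = InputDataSet$x * 2)',
    @input_data_1 = N'SELECT 1 AS x UNION ALL SELECT 2 AS x'
WITH RESULT SETS ((doubled INT));
```

The input query feeds `InputDataSet` as an R data frame, and whatever the script assigns to `OutputDataSet` comes back to the caller as an ordinary result set.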

SQL Server 2016 editions will include Enterprise, Standard, Express and Developer. The SQL Server 2016 Developer edition, with the full capabilities of the latest SQL Server release, will be free.



[Source:- searchsqlserver]

Latest AirDroid update officially fixes seven-month-old security issues


Original post, December 7: A few days ago, the security firm Zimperium publicly revealed that the popular remote access management app AirDroid had some serious security issues. Today, the developer behind AirDroid has released a beta version of the app that it hopes fixes these problems.

Zimperium stated that it had found that AirDroid in its current form could allow an attacker on the same network to gain full access to the owner’s device through the app. The firm actually found this flaw a number of months ago and informed the app’s development team, but for some reason that team did nothing until Zimperium made the decision to publicly reveal its findings earlier this month.

Now, the team has quickly pushed out a beta of AirDroid that reportedly does fix the security problems. Zimperium is apparently going to test the beta out to make sure all the holes are closed up before the app team releases the non-beta version in the Google Play Store. In the meantime, we don’t recommend using AirDroid until it has been verified that the app has indeed plugged its security leaks. You can’t be too careful.



[Source:- Androidauthority]

Android Studio for beginners, Part 4: Advanced tools and plugins


Android Studio offers a rich palette of development tools, and it’s compatible with many plugins. The first three articles in this series focused on basic tools for building simple mobile apps. Now you’ll get acquainted with some of the more advanced tools that are part of Android Studio, along with three plugins you can use to extend Android Studio.

We’ll start with Android Device Monitor, Lint, and Android Monitor, three tools you can use to debug, inspect, and profile application code in Android Studio. Then I’ll introduce you to the ADB Idea, Codota Code Search, and Project Lombok plugins.

Debugging with Android Device Monitor

Android Device Monitor is an Android SDK tool for debugging failing apps. It provides a graphical user interface for the following SDK tools:

  • Dalvik Debug Monitor Server (DDMS): A debugging tool that provides port-forwarding services, screen capture on the device, thread and heap information on the device, logcat, process, radio state information, incoming call and SMS spoofing, location data spoofing, and more.
  • Tracer for OpenGL ES: A tool for analyzing OpenGL for embedded systems (ES) code in your Android apps. It lets you capture OpenGL ES commands and frame-by-frame images to help you understand how your graphics commands are being executed.
  • Hierarchy Viewer: A graphical viewer for layout view hierarchies (the layout view) and for magnified inspection of the display (the pixel perfect view). This tool can help you debug and optimize your user interface.
  • Systrace: A tool for collecting and inspecting traces (timing information across an entire Android device). A trace shows where time and CPU cycles are being spent, displaying what each thread and process is doing at any given time. It also inspects the captured tracing information to highlight problems that it observes (from list item recycling to rendering content) and provide recommendations about how to fix them.
  • Traceview: A graphical viewer for execution logs that your app creates via the android.os.Debug class to log tracing information in your code. This tool can help you debug your application and profile its performance.

To launch Android Device Monitor from your command line, execute the monitor program in your Android SDK’s tools directory. If you prefer to run the tool from Android Studio, choose Tools > Android > Android Device Monitor.

You might remember from Part 1 that I used Android Studio to launch my W2A example app in the Nexus 4 emulator. I then launched Android Device Monitor from Android Studio. Figure 1 shows the resulting screen.

androidstudiop4 figure1
Figure 1. The Devices tab appears when DDMS is selected.

The Devices tab shows all accessible devices, which happens to be the emulated Nexus 4 device in this example. Underneath the highlighted device line is a list of currently visible subclass objects.

I highlighted the W2A activity object identified by its ca.javajeff.w2a package name, then clicked Hierarchy View to activate the Hierarchy Viewer tool. Figure 2 shows the result.

androidstudiop4 figure2
Figure 2. The layout hierarchy of the activity screen is shown in the Tree View pane.

Hierarchy Viewer displays a multipane user interface. The Tree View pane presents a diagram of the activity’s hierarchy of android.view.View subclass objects. The Tree Overview pane offers a smaller map representation of the entire Tree View pane. The Layout View pane (whose contents are not shown in Figure 2) reveals a block representation of the UI. See “Optimizing Your UI” to learn more about the Hierarchy Viewer tool and these panes.

If you attempt to run Hierarchy Viewer with a real (non-emulated) Android device, you could experience the error messages that appear in Figure 3.

androidstudiop4 figure3
Figure 3. Hierarchy Viewer often has trouble with real Android devices.

These messages refer to the view server, which is software running on the device that returns View objects diagrammed by Hierarchy Viewer. Production-build devices return these error messages to strengthen security. You can overcome this problem by using the ViewServer class that was created by Google software engineer Romain Guy.

Inspecting code with Lint

Lint is an Android SDK code-inspection tool for ensuring that code has no structural problems. You can use it to locate issues such as deprecated elements, or API calls that aren’t supported by your target API.

Although Lint can be run from the command line, I find it more helpful to run this tool from within Android Studio. Select Analyze > Inspect Code to activate the Specify Inspection Scope dialog box shown in Figure 4. Then select your desired scope (whole project, in this case), and click the OK button to perform the analysis. The results will appear in the Inspection Results window, where they are organized by category.

androidstudiop4 figure4
Figure 4. I’ve decided to inspect the entire project.

As you can see in Figure 5, Lint has spotted a few issues:

androidstudiop4 figure5
Figure 5. Lint reports that the androidAnimation field could have been declared private.

Lint also complained about the following:

  • A missing contentDescription attribute on the ImageView element in main.xml hampers the app’s accessibility.
  • The root LinearLayout element in main.xml paints the background white (#ffffff) with a theme that also paints a background (inferred theme is @style/AppTheme). Overdrawing like this can hurt performance.
  • The dimens.xml file specifies three dimension resources that are not used. Specifying unused resources is inefficient.
  • On SDK v23 and up, the app data will be automatically backed up and restored on app install. Consider adding the android:fullBackupContent attribute on the application element in AndroidManifest.xml, specifying an @xml resource that configures which files to back up; otherwise you might face a security issue.
  • Support for Google app indexing is missing.
  • I stored android0.png, android1.png, and android2.png in drawable, which is intended for density-independent graphics. For a production version of the app, I should have moved them to drawable-mdpi and considered providing higher and lower resolution versions in drawable-ldpi, drawable-hdpi, and drawable-xhdpi. No harm is done in this example, however.
  • Lint checked my spelling, noting the reference to javajeff in the manifest element’s package attribute, in AndroidManifest.xml.
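The auto-backup finding can be addressed by pointing the manifest at an explicit backup rules file. A minimal sketch follows; the rules file name and the excluded path are illustrative, not taken from the example project:

```xml
<!-- In AndroidManifest.xml, on the application element: -->
<application android:fullBackupContent="@xml/backup_rules">
    <!-- activities, services, etc. -->
</application>

<!-- res/xml/backup_rules.xml: keep sensitive data out of auto backup -->
<full-backup-content>
    <exclude domain="sharedpref" path="secrets.xml" />
</full-backup-content>
```

With an explicit rules file in place, Lint no longer has to guess what the automatic backup will capture.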

See “Improve Your Code with Lint” to learn more about using Lint in Android Studio.

Profiling with Android Monitor

Profiling running apps to find performance bottlenecks is an important part of app development. Android Device Monitor’s Traceview tool offers some profiling support. Android Monitor offers even more.

Android Monitor is an Android Studio component that helps you profile app performance to optimize, debug, and improve your apps. It lets you monitor the following aspects of apps running on hardware and emulated devices:

  • Log messages (system-defined or user-defined)
  • Memory, CPU, and GPU usage
  • Network traffic (hardware device only)

Android Monitor provides real-time information about your app via various tools. It can capture data as your app runs and store it in a file that you can analyze in various viewers. You can also capture screenshots and videos as your app runs.

You can access Android Monitor via Android Studio’s Android Monitor tool window. Select View > Tool Windows > Android Monitor or just press Alt+6:

androidstudiop4 figure6
Figure 6. The logcat pane shows log messages for my Amazon Kindle device.

Figure 6 reveals the Android Monitor tool window, which presents drop-down list boxes that identify the device being monitored (in this case, my Amazon Kindle Fire) and the app being debugged on the device. Because ADB integration hasn’t been enabled, “No Debuggable Applications” appears in the latter list. Check Tools > Android > Enable ADB Integration to enable ADB integration.

After enabling ADB integration, I observed that “No Debuggable Applications” was replaced in the drop-down list with “ca.javajeff.w2a,” the package name for the W2A application that was running on my Kindle.

Below the two list boxes are a pair of tabs: logcat and Monitors. The former tab shows logged messages from the device and the latter tab reveals graphics-based memory, CPU, network, and GPU monitors (see Figure 7).

androidstudiop4 figure7
Figure 7. The GPU monitor is disabled for Android 4.0.3, which is the Android version that runs on my Kindle.

The memory monitor shown in Figure 7 reveals that the app occupies almost 13 megabytes and its subsequent memory usage is constant, which isn’t surprising because the app doesn’t make any explicit memory allocations, and the underlying APIs probably don’t require much additional memory. The CPU monitor shows only a slight amount of CPU use via a narrow red line about 1 minute into the monitoring. This usage arose from clicking the Animate button several times. No networking activity is displayed because the app isn’t making network requests. Finally, the GPU monitor is disabled because I’m running an older version of Android (4.0.3), which doesn’t support GPU monitoring.

The left side of the Android Monitor tool window contains a small tool bar with buttons for obtaining a screenshot (the camera icon), recording the screen, obtaining system information (activity manager state, package information, memory usage, memory use over time, and graphics state), terminating the application, and obtaining help. I clicked the camera button and obtained the screenshot shown in Figure 8.

androidstudiop4 figure8
Figure 8. Click the camera button on the left side of the Android Monitor tool window to obtain a screenshot.

See “Android Monitor Overview” to learn more about Android Monitor.

Extending Android Studio apps with plugins

Android Studio’s plugins manager makes it very easy to find and install plugins. Activate the plugin manager by selecting File > Settings followed by Plugins from the Settings dialog box:

androidstudiop4 figure9
Figure 9. The Settings dialog box shows all installed plugins.

Next, click Browse repositories . . . to activate the Browse Repositories dialog box, which presents a full list of supported plugins:

androidstudiop4 figure10
Figure 10. The pane on the right presents detailed information about the selected plugin.

I’ll introduce three useful plugins: ADB Idea, Codota Code Search, and Project Lombok, and show you how to install and use them.

ADB Idea

ADB Idea speeds up your day-to-day Android development by providing fast access to commonly used ADB commands, such as starting and uninstalling an app:

androidstudiop4 figure11
Figure 11. Click Install to install ADB Idea.

Select ADB Idea in the repository list of plugins and then click the Install button. Android Studio proceeds to download and install the plugin. It then relabels Install to Restart Android Studio. Restarting activates ADB Idea.

Android Studio lets you access ADB Idea from its Tools menu. Select Tools > Android > ADB Idea and choose the appropriate command from the resulting pop-up menu:

androidstudiop4 figure12
Figure 12. Select the appropriate ADB command from the pop-up menu.

The app must be installed before you can use these commands. For example, I selected ADB Restart App and observed the following messages as well as the restarted app on my Amazon Kindle device.

androidstudiop4 figure13
Figure 13. Each message identifies the app, operation, and device.

Codota Code Search

Use the Codota Code Search plugin to access the Codota search engine, which lets you look through millions of publicly available Java source code snippets (on GitHub and other sites) for solutions to coding problems:

androidstudiop4 figure14
Figure 14. Click Install to install Codota Code Search.

To install this plugin, select Codota in the repository list of plugins and then click the Install button. After Android Studio has downloaded and installed the plugin, it will relabel the Install button to Restart Android Studio. Restarting activates Codota Code Search.

Android Studio lets you access Codota Code Search by right-clicking on Java code in the editor window and selecting the Search Open Source (Codota) menu item (or by pressing Ctrl+K), as shown in Figure 15.

androidstudiop4 figure15
Figure 15. Click Search Open Source (Codota) to access the Search Codota dialog box.

Android Studio responds by displaying the Search Codota dialog box, whose text field is blank or populated with the full package name of the Java API type that was right-clicked. Figure 16 shows this dialog box.

androidstudiop4 figure16
Figure 16. Press Enter to initiate the search for Java code snippets related to ImageView.

Codota Code Search passes the search text to the Codota search engine and presents vertically scrollable search results in a CodotaView tool window.




[Source:- Javaworld]

Node.js update makes JavaScript VMs future-proof


The Node.js Foundation and NodeSource are moving the Node.js platform toward greater module stability, better security, and more independence in the use of JavaScript virtual machines.

Working with IBM, Intel, Microsoft, and Mozilla, the Node.js Foundation today unveils the Node.js ABI (application binary interface) Stable Module API. This effort would define a stable module API independent of changes in V8, which has anchored Node. In addition to the API work, the Node.js build system will begin producing nightly builds of node-chakracore, which has Node running with Microsoft’s ChakraCore JavaScript engine.

The API constitutes a first step toward JavaScript virtual machine neutrality, said Arunesh Chandra, Microsoft senior program manager. “This API is going to help the native module developers to guarantee an ABI-stable API surface for Node.” The ability to use JavaScript engines other than Google’s V8 could expand Node’s use in areas like mobile computing and the internet of things, according to the Node.js Foundation.

The ABI-stable API guarantees that changes that happen at a VM level will not require a new version of Node.js, said Dan Shaw, CTO of Node technology vendor NodeSource and a member of the foundation’s board of directors. With the change, users can migrate from a given version of Node to the next version without having to recompile Node native code modules.

“Think of this as a shim in between Node and the JavaScript virtual machine and the native packages,” said Gaurav Seth, principal program manager lead at Microsoft. Native modules can start targeting this middle layer and become “future-proof,” he said. It will become easier to upgrade both Node versions and npm packages, and developers will find it easier to migrate to newer versions of V8.
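The shim idea can be illustrated with a toy analogy. The sketch below is Python, not Node’s actual module API, and every class and method name in it is hypothetical; it only shows the pattern: modules target a small stable surface, so the engine underneath can change without breaking them.

```python
class StableModuleAPI:
    """The fixed surface modules are written against (hypothetical)."""
    def __init__(self, engine):
        self._engine = engine

    def get_global(self, name):
        # Delegate to whichever engine is plugged in underneath.
        return self._engine.lookup_global(name)

class V8Like:
    """Stand-in for one JavaScript engine."""
    def lookup_global(self, name):
        return f"v8:{name}"

class ChakraLike:
    """Stand-in for a different JavaScript engine."""
    def lookup_global(self, name):
        return f"chakra:{name}"

def module_code(api):
    # The "native module" touches only the stable API, never the engine.
    return api.get_global("console")

# The same module code runs unchanged on either engine.
on_v8 = module_code(StableModuleAPI(V8Like()))
on_chakra = module_code(StableModuleAPI(ChakraLike()))
```

Swapping `V8Like` for `ChakraLike` requires no change to `module_code`, which is the guarantee the ABI-stable API aims to give native modules.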

“[The API] allows Node to be highly optimized for different types of devices, scenarios, and workloads,” enabling different virtual machines to be used for specific devices, Chandra said.

Also this week, the foundation will take over the Node.js Security Project, which provides a unified process for finding and disclosing security vulnerabilities in the Node ecosystem. The foundation will take over the project from Lift Security.

Addressing Node npm module dependency issues, NodeSource is introducing NodeSource Certified Modules. The company will curate modules that are publicly available in the npm registry, certifying them for security and dependencies. The service, currently offered in a private beta stage, addresses predicaments like the left-pad incident earlier this year, in which an npm package with 17 lines of code was removed from the registry and caused other packages that depended on it to fail. NodeSource Certified Modules will never get unpublished, the company vows.
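For context on how small that 17-line dependency was, here is a minimal Python equivalent of left-pad (illustrative only, not the original JavaScript):

```python
def left_pad(text, length, fill=" "):
    """Pad text on the left with a fill character up to the given length.

    Returns text unchanged if it is already at least that long.
    """
    if len(text) >= length:
        return text
    return fill * (length - len(text)) + text

assert left_pad("42", 5) == "   42"
assert left_pad("42", 5, "0") == "00042"
```

That an entire build ecosystem could break over something this trivial is exactly the fragility the certified-modules service is meant to guard against.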

NodeSource also is introducing NSolid version 2.0, an upgrade to the company’s commercially supported version of Node, featuring security enhancements. These include runtime package vulnerability monitoring and customizable application security policies. Vulnerability monitoring is provided by security research firm Snyk, which can find issues such as distributed denial-of-service issues. Also featured is a guaranteed 24-hour response to security updates in the core Node project.

To improve reliability, version 2.0 features CPU profiling, heap snapshots, and async activity tracking. The release also can be augmented with external tooling for performance monitoring and diagnostics. NSolid is available on AWS Marketplace, for one-click deployment of the NSolid runtime and console on the Amazon Web Services cloud. The platform also supports orchestration frameworks, including Kubernetes, OpenShift, and Cloud Foundry.



[Source:- JW]