Apple releases public betas for iOS 9.3 and OS X 10.11.4

Every Apple operating system has a new developer beta available now, and iOS and OS X have new public betas out too.

It’s a field day for beta testers, as Apple has released new developer betas for all of its operating systems—iOS 9.3, OS X 10.11.4, watchOS 2.2, and tvOS 9.2. Participants in the public betas can get their hands on the betas for iOS and OS X.

If you’re enrolled in the developer program, hit up the Developer Center. If you’re not a registered developer, but still want to test drive these new beta features, you can sign up for the Public Beta program. The public betas for iOS and OS X usually trail the developer beta releases by a few days. UPDATE February 23, 11:40 a.m. Pacific: iOS 9.3 and OS X 10.11.4 are now in public beta.

According to 9to5Mac, the official versions of iOS 9.3 and OS X 10.11.4 are expected to launch around Apple’s March 15th event. Here’s a breakdown of all the new features we got.

iOS 9.3 beta 4

The all-new, system-wide Night Shift mode will automatically adjust the display colors on your iPhone, iPad, or iPod touch. Once the mode is enabled, the colors will shift “to the warmer end of the spectrum” starting at sunset and revert back to normal in the morning. The main reason is to reduce the harsh blue light that makes it difficult to fall asleep at night.
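Apple hasn’t published how Night Shift works internally, but the scheduling idea described above can be sketched in a few lines. The snippet below is a hypothetical illustration only—the `target_color_temp` helper and the kelvin values are assumptions for the sketch, not an Apple API: it picks a warmer white point whenever the current time falls in the sunset-to-sunrise window, which wraps past midnight.

```python
from datetime import time

# Hypothetical sketch of a Night Shift-style schedule. This is NOT Apple's
# implementation; the temperature targets are illustrative assumptions.

DAY_TEMP_K = 6500    # typical daytime display white point
NIGHT_TEMP_K = 4500  # assumed "warmer end of the spectrum" target

def target_color_temp(now: time, sunset: time, sunrise: time) -> int:
    """Return the white-point temperature (kelvin) to use at `now`."""
    after_sunset = now >= sunset
    before_sunrise = now < sunrise
    # The night window spans midnight, so either condition puts us in it.
    if after_sunset or before_sunrise:
        return NIGHT_TEMP_K
    return DAY_TEMP_K

print(target_color_temp(time(22, 0), time(18, 30), time(6, 45)))  # → 4500
print(target_color_temp(time(12, 0), time(18, 30), time(6, 45)))  # → 6500
```

A real implementation would also ramp between the two values gradually rather than switching instantly, and would compute sunset/sunrise from the device’s location.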

Other features in iOS 9.3 include requiring a password or Touch ID to open the Notes app, and you can also set individual notes to be password-protected. For iPhone 6s and 6s Plus users, iOS 9.3 adds 3D Touch support for a few more Apple apps, including Settings. Deep press on the Settings app icon to launch a Quick Action shortcut to tinker with Battery, Bluetooth, and Wi-Fi. And finally, iOS 9.3 supports Wi-Fi Calling for Verizon customers, following AT&T and T-Mobile.

In addition, iOS 9.3 also brings landscape support in the News app, HealthKit-enabled app suggestions in the Health app, PDF-syncing via iCloud for iBooks, better Apple Music and Maps integrations in CarPlay, as well as a few new education features for managing shared iPads in a classroom setting.

OS X 10.11.4 beta 4

This beta version of OS X El Capitan comes with support for Live Photos, the moving images that can be taken with the new iPhone 6s and 6s Plus. Prior to OS X 10.11.4, you could only view Live Photos on the Mac via the Photos app, and sharing was limited to iCloud Photo Sharing. In this developer beta, you can now view and send Live Photos via the Messages app on your Mac.

Just like in iOS 9.3, OS X 10.11.4 brings support for password-protecting the Notes app. You will also be able to import data from other note-taking apps, including Evernote. The official OS X 10.11.4 update will also be accompanied by a new version of iTunes.

watchOS 2.2 developer beta 4

This software update for Apple Watch gives the Maps app new features. With watchOS 2.2, you can launch the Maps app and have it instantly navigate to your given destination. You can also tap to find businesses with a “Nearby” feature powered by Yelp.

Once your iPhone gets updated to iOS 9.3, you will be able to pair multiple Apple Watches with it. This watchOS update will be able to recognize which Watch is active at any given time and switch between the different devices.

tvOS 9.2 developer beta 4

The software update for the fourth-generation Apple TV incorporates elements of iOS. With tvOS 9.2, you will be able to organize your apps into folders and switch between apps via a redesigned app switcher.

You will also be able to talk to the Siri Remote to dictate into text fields to search for content or even spell out your usernames and passwords. Other features coming along with tvOS 9.2 include support for iCloud Photo Library, Bluetooth keyboards, and conference room mode.

[Source:- Macworld]

Apple enumerates cooperation with FBI in San Bernardino iPhone 5c case

In a court filing made yesterday, the head of Apple’s global privacy and law enforcement compliance team spelled out Apple’s responses to the myriad subpoenas the company received to collect data on the San Bernardino shooters. Manager Lisa Olle declared that in many cases, including just days after the incident, Apple not only received multiple data requests from the FBI, but acted upon them and provided the compelled data on the same day the court order arrived in Cupertino.

Olle’s timetable shows that after an initial phone call early on December 5, Apple was served four times for data. The initial request, made on December 5, covered three names and nine accounts. December 6 saw requests for three accounts, fulfilled the same day. On December 16, the FBI sought information on one name and seven accounts, and Apple delivered that day as well.

The search warrant for the suspect’s iCloud account was served on Friday, January 22, a month and a half after the shooting. All of the information in Apple’s possession was handed over on Tuesday, January 26. The ex parte application compelling Apple to develop the break-in tool for the county-owned iPhone 5c at the center of the case was served on February 16.

Olle, as manager of the law enforcement coordination team, also spelled out what she believes is required of Apple to conform to the application. Above and beyond the engineering effort needed to code the break-in tool, the department head is calling for one or two facilities similar to a US government “Sensitive Compartmented Information Facility,” which would need to be “tightly controlled and monitored around the clock.” Also noted is the difficulty of communicating data, device accountability, and the need for “technical escalations” when dealing with law enforcement officials.

Laying the groundwork for an “undue burden” defense, Olle also notes that she expects more law enforcement agencies to request the use of the tool the FBI is demanding, and that Apple “would need to hire people whose sole function would be to assist with processing and effectuating such orders.” These new hires “would have no other necessary business or operations function at Apple” and would include paralegals, engineers, and forensic specialists dedicated to trial work.

[Source:- Macnn]

On the Mac, Siri has room to grow

Siri is on nearly every Apple device but the Mac, but its debut on the desktop has the potential to change everything.

That was my initial reaction when I heard the report last week that one of the major features of the next version of OS X would be Apple’s virtual assistant, Siri.

That’s a long time in the making: Siri debuted on the iPhone 4s back in 2011—seems virtually ancient, doesn’t it? Since then, the voice-activated assistant has been a staple of Apple’s devices, migrating first to the iPad, and more recently to the Apple Watch early last year and the new Apple TV last fall.

But the Mac, Apple’s longest-running product line, has been left out in the cold. It’s an odd omission, given that most Macs are more than powerful enough to handle the computing needs of a virtual assistant. Meanwhile, there’s been competition from other quarters, such as Microsoft’s Cortana on Windows 10 and Google’s voice search built into Chrome and Android, both of which are even available on iOS.

That said, Siri’s late arrival on the Mac doesn’t make it an unwelcome one. I’m just hoping that it comes with some improvements.

Always on

According to the same report that heralded Siri’s inclusion in OS X, the Mac version will—like its iPhone counterpart—respond to “Hey Siri” when it’s plugged in. That’s good news for iMac users, since it means that Siri’s always available. As an Amazon Echo user, I know the benefits of having an always-on voice interface that you can use when your hands are full or you’re far away from your computer.

But I also have some worries, both in terms of software and hardware. For one, part of the reason that the Echo works so well is because it has an array of seven carefully tuned microphones designed to pick up your voice from far away. Most Macs, meanwhile, have limited ambient noise correction on their internal microphones—and most have only one mic, unlike recent iPhones, which use a second mic for noise cancelling. Hearing you say “Hey Siri” may be possible, but it remains to be seen how well your Mac’s internal mic will understand your query from across the room, or when there’s a lot of background noise.

Another potential wrinkle is that we have many devices that can potentially respond to “Hey Siri” these days, including iPhones, iPads, and the Apple Watch. Will Siri know, intelligently, which device to trigger when the phrase is used? Or will we instead be greeted by a chorus of Siri responses from all our various gadgets? While it may not make the first iteration, I’m hopeful this might lead to the ability to have customizable phrases to trigger Siri on different devices. (Okay, maybe I just want the Star Trek experience of calling my iMac “Computer.” Sue me!)

Talk to each other

I’d also like to see Apple make Siri more aware of all of our various devices, and in bringing the virtual assistant to the Mac, I’m hopeful that we’re one step closer to that being a reality. I’d like to be able to tell Siri to look something up on my Mac and send the result to my iPhone. Or have it tell my Apple TV to start playing a certain video.

More to the point, I’d like Siri to be a little smarter and, well, a little more assistant-like. For example, when I ask Siri “How long will it take me to get to my dentist appointment today?” it tells me that I have an appointment at 3 p.m.—but that’s not what I asked. It’s all the more surprising since I know my calendar entry includes the location of the appointment, and iOS will even pop up a notification when it’s time to leave.

In short: Real assistants are good at synthesizing information. Siri, not so much.

Of course, that’s the hard part of programming a virtual assistant or AI: creating those links between different information, which the human brain does so naturally. I’m sure Apple’s engineers are hard at work on an even smarter Siri, but in the meantime it would be great if the company would make the assistant more aware of the information it already has.

The more, the merrier

Finally, many of us have long awaited some form of third-party extensibility for Siri. Might the integration with OS X at last provide a platform on which it makes sense to test that out? Despite the recent enforcement of restrictive features like sandboxing, the Mac platform has historically provided a much more open environment, and many of the concerns that one might have on a mobile platform—limited resources, locked-down security model, etc.—aren’t as much of a worry on a desktop or laptop.

I’d love to see Apple start to open up Siri to Mac developers, letting them integrate the virtual assistant with their applications. We don’t all do our email via Mail, our browsing via Safari, our writing in Pages, or our tweeting via the Twitter app. Letting third-party developers create hooks into Siri would open up the possibility for some great new implementations and innovation, just as it has for features like 3D Touch.

Siri on the Mac is full of possibilities and opportunities for Apple, developers, and users—just as long as the company decides to actually take a chance and not just give us a warmed over version of the Siri we already know.

[Source:- Macworld]