Oracle plans two major Java EE upgrades for the cloud

Modernizing Java EE (Enterprise Edition), the server-side version of Java, for the cloud and microservices will require two critical upgrades to the platform. Version 8 is set to arrive in late 2017, followed by Java EE 9 a year later, Oracle revealed on Sunday.

Although Java EE is already in use in cloud deployments, Oracle sees a need to better equip it for this paradigm, said Anil Gaur, Oracle’s group vice president of engineering, at the JavaOne conference in San Francisco. To this end, Java EE 8, which had already been mapped out, will receive two additional sets of capabilities: one for configuration of services, and the other for health checking to monitor and manage service communications.

Oracle will publish Java specification requests (JSRs), which are official amendments to the Java platform, detailing these two efforts. Java EE 8 had been scheduled to arrive by next summer, but the additions will push the release out several months. Java EE 8 also will be fitted with enhancements previously specified, such as ease of development.

The configuration specification will enable services to scale horizontally and help specify capabilities such as quality of service. These details will be maintained outside the application code itself, so when a service expires, its configuration is still there for use with a similar service, Gaur said. The health service specification will provide a consistent set of APIs so that services can communicate their health and developers can specify what corrective measures may need to be taken.

Java EE 9, meanwhile, will foster deployment of smaller units of services, which can scale independently. Key-value store support for databases such as MongoDB and Cassandra is planned, along with eventual consistency in transactions. Oracle also is exploring support for a serverless model, in which the runtime environment manages code execution. A state service and multitenancy for tenant-aware routing and deployment will also be considered, along with security capabilities for OAuth and OpenID.

Java EE has been the subject of much debate in recent months, with proponents upset over a perceived lack of direction for the platform. In response, Oracle first expressed its cloud intentions for Java EE in July. “Developers are facing new challenges as they start writing cloud-native applications, which are asynchronous in nature,” Gaur said.

Vendors have begun using Java EE APIs to solve these problems. But each vendor is doing it in its own way, with consistency lacking, Gaur said. With no standard way, it is impossible to ensure compatibility of these services.

Gaur also highlighted use of a reactive style of programming for building loosely coupled, large-scale distributed applications. Moving to the cloud requires migrating from a physical infrastructure to virtualization as well as a shift from monolithic applications, he said.

Also planned for Java EE is comprehensive support for HTTP/2 beyond the support that had already been planned for the Java servlet. A Docker model, enabling packaging of multiple services in a single container, is planned. Work on both Java EE 8 and Java EE 9 is proceeding in parallel, Gaur said.

Asked whether users might wait the extra year for Java EE 9 rather than first upgrading to Java EE 8, Gaur said that might be OK for some people. But others must move at “cloud speed” and need things more quickly, he said.

Multiple parties, including Red Hat and IBM, have been pondering their own improvements to Java EE, believing that Oracle had been neglecting it. But Oracle says its silence simply indicated it had been reflecting on what to do with Java EE.

Oracle on Sunday also detailed some intentions for Java SE, the standard edition of the platform, aside from what already has been specified for the upcoming Java SE 9 platform. Plans include making it easier for developers to deal with boilerplate code, making such classes easier to read, said Brian Goetz, a Java language architect at Oracle.

The company also wants to expand the scope of type inference, which allows for the removal of redundant code while maintaining the benefits of strong static typing, by applying it to local variables. But Goetz noted that this does not mean Java is being turned into JavaScript.

Also at JavaOne, Oracle announced its intention to soon distribute the Oracle JDK with Docker, the popular Linux container platform. “We want to make Java a first-class citizen for Docker and we want to do it with a distribution model that makes sense,” said Georges Saab, Oracle’s vice president of development. Java and Docker have not been strangers to each other previously, with Docker already popular as a mechanism for providing improved packaging.

[Source:- JW]

A quick update on our rebrand to On MSFT

Hey everyone.  Not too long ago we announced that WinBeta would be rebranding to On MSFT (pronounced On Microsoft). We’re nearing the completion of our rebrand to On MSFT and I am happy to reveal that we should be ready to launch in mid-October (barring any last-minute glitches — who said web development was an easy process?).  We’ve been working tirelessly to ensure an amazing experience for our readers.  We’ve been taking our time and ironing out bugs rather than rushing the process, which I’m sure many of you can appreciate.

Here are a few important tidbits for the new site: our Podcast will be returning in October with Sean as the host and a surprise co-host that we’re sure our readers will love.  We’ll also have special guests on the Podcast to talk about Microsoft and a forums area for community members to engage with each other.  The new site will also make it easier to sift through the latest Microsoft news, as well as load faster on mobile.  We’re really excited to get the new design up and running.

We have a few other neat surprises in store for the new site, which I’m positive our readers will absolutely enjoy, so stay tuned. Exciting times ahead! For now, keep an eye on winbeta.org for all the latest Microsoft news.  When the time comes for us to switch over to onmsft.com, we’ll let everyone know and the transition will be seamless!

Come write for On MSFT

On a side note, we are looking to expand our team with a new Editor and a new writer.  We are looking to hire someone to work between the hours of noon and 5 PM PST.  Writers who show dedication are rewarded and compensated: writers who work hard can make the equivalent of a well-paid part-time job or a decently-paid full-time job.  As writers advance, they will have the opportunity to attend special tech events and test out the latest gadgets OEMs have to offer!  Writers also have the chance to be promoted and climb the ranks!

If you are interested, here are the important steps you must follow.  Any submission that does not follow these simple steps will be disregarded for not following instructions: Send an email to “ron@winbeta.org” with “WinBeta Job” in the subject line and introduce yourself.  Be sure to include your name, your location, why you are interested in the position, any past publications you have written for, and why you believe you would be a perfect fit for the team.  You are also required to send in some examples of your written work.  Good luck!

[Source:- Winbeta]

SQL Server 2016 Release Candidate 3 now available

We are excited to announce that our fourth and final SQL Server 2016 release candidate, SQL Server 2016 Release Candidate (RC) 3, is now available for download.

Our SQL Server Release Candidates represent important milestones in the release of SQL Server 2016: the product is now essentially feature complete, meaning a very rich set of capabilities is available. These include real-time operational analytics, rich visualizations on mobile devices, built-in advanced analytics, new advanced security technologies and new hybrid scenarios that allow you to securely stretch data to the cloud.

SQL Server 2016 RC 3 is the last of our publicly-available release candidates. You can try this in your development and test environments, and it is available for download today.

In SQL Server 2016 RC 3, enhancements consisted primarily of bug fixes. We continue to refine the product for general availability. For the current release notes, see SQL Server 2016 Release Notes.

SQL Server 2016 Reporting Services mobile reports and KPIs in the Android app for Power BI

We are also happy to announce a preview of the Power BI app for Android, with support for SQL Server 2016 Reporting Services. With this update you can seamlessly bring your on-premises data to your Android phone and stay on top of your business from anywhere, with out-of-the-box mobile reports and KPI tracking. Read the Power BI blog post to learn more, or download the app to get started.

Download SQL Server 2016 RC 3 today!

To learn more, visit the SQL Server 2016 preview page. To experience the new, exciting features in SQL Server 2016 and the new rapid release model, download the preview and start evaluating the impact these new innovations can have for your business.

Questions?

Join the discussion of the new SQL Server 2016 capabilities at MSDN and Stack Overflow. If you run into an issue or would like to make a suggestion, you can let us know at Connect. We look forward to hearing from you!

[Source:- blogs.technet]

Ensuring availability during the summer season

Richard Agnew, VP NW EMEA, Veeam, looks at delivering 24/7 availability.

Summer is here, and many are getting ready to take a well-deserved holiday. However, this does not mean that expectations of continuous access to applications, services and data should be lowered.

A modern business relies on delivering 24/7 availability, regardless of employee holidays. But what happens if the system breaks down during the week that a corporate IT manager has gone away? It will take longer than usual to get systems running again, and that in turn will impact corporate revenue and reputation.

In order to avoid this, there are three simple precautions that businesses should take during the summer holiday season to ensure that corporate applications, services and data remain continuously available.

Avoid downtime
Planned or unplanned downtime now has a direct impact on vital services, whether the cost is revenue or reputation. According to the 2016 Veeam Availability Report, the average cost of downtime for mission-critical applications is $100,266 per hour in the UK specifically, and 59 percent of respondents revealed their organisations’ applications encounter unplanned downtime caused by IT failures, external forces or other factors, up to ten times per year. Even when employees are made aware that the system will be down for a period of time, the outage may still have a negative impact on productivity, profitability and workflow. A modern business requires constant and reliable data availability, especially during the holiday period when staff levels are lower.

Delete unnecessary data
Garbage data is a recognised problem, and one that can have the biggest impact on a firm’s availability. Data like this eats up resources in the data centre, and can cause poor performance and system errors. To maintain high availability, it is essential to keep garbage data under control. Common culprits are installation files duplicated at several locations, as well as virtual machines that are invisible because they have been removed from the inventory, but not permanently deleted.

It’s easy to keep unnecessary amounts of garbage data when nobody knows what it is, and no one wants to delete it in case it’s something important. This method of keeping useless data is a legacy from the days when data protection and availability solutions were much less sophisticated, and restoring lost data was a cumbersome and difficult process. Today, data recovery is much quicker, allowing you to recover what you want, when you want. Whether you have lost a backup copy of an important piece of data or unintentionally deleted some garbage data, it is much easier to restore, usually within seconds.

Have procedures in place before the holiday season
Another equally important issue is that data recovery for any application consumes one of the most valuable resources: time. The average downtime of critical applications in the IT systems of UK companies is five hours, an extended time for an organisation to be offline when it could be as low as fifteen minutes. To ensure that services, applications and data are available throughout the holiday season, it is not only IT solutions that must be put in place, but also routines. Planning how to restore data in the fastest and easiest way when a problem arises is essential to avoid unnecessary downtime and loss of corporate revenue.

Availability is as important during the holiday season as any other time, and downtime remains costly no matter what time of year it occurs. In today’s digital society, end-users are expecting organisations to be Always-On and available. Unfortunately, the average number of failures in modern enterprises is still high. According to the 2016 Veeam Availability Report, 84 percent of senior IT decision makers across the globe admit to suffering an ‘Availability Gap’ between what IT can deliver, and what users demand. This gap costs $16 million a year in lost revenue and productivity, in addition to the negative impact on customer confidence and brand integrity (according to 68 percent and 62 percent of respondents, respectively). This cost only increases as more time passes, and unless procedures are put in place before the holiday season, there is a high risk of unnecessarily long downtime and high revenue loss.

Is your data centre ready for its summer holiday?

[Source:- CBR]

Google’s Go language ventures into machine learning

Machine learning developers who want to use Google’s Go language as their development platform have a small but growing number of projects to choose from.

Rather than call out to libraries written in other languages, chiefly C/C++, developers can work with machine learning libraries written directly in Go. Existing machine learning libraries in other languages have far larger user communities, but there’s clearly an interest in Go toolkits that take advantage of the language’s conveniences.

GoLearn, described as a “batteries included” machine learning library, is one of the most prominent. “Simplicity, paired with customisability, is the goal,” the developers write in their introduction to the project. Some of GoLearn’s interfaces to data handling are implemented in the same manner as scikit-learn, a popular Python machine learning project. Python refugees ought to be able to make short work of it. There’s a modest amount of C++ used for the linear models library, but the rest is pure Go.

Goml bills itself as “Golang machine learning, on the wire,” meaning it “includes many models which let you learn in an online, reactive manner by passing data to streams held on channels,” according to the developers. The project stands out by emphasizing its possibilities as a component for other applications, made easier by “comprehensive tests, extensive documentation, and clean, expressive, modular source code.” If all you need is something for basic binary classification problems (is this spam or not?), a smaller library named Hector will fit the bill.
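The "online, reactive" style goml describes, where a model learns as examples arrive on a channel rather than from a fixed dataset, can be illustrated in plain Go. The sketch below is a hedged illustration of that pattern, not goml's actual API: the `Observation`, `Model` and `Learn` names are hypothetical, and the model is a single-feature linear fit updated by one stochastic gradient descent step per streamed example.

```go
package main

import "fmt"

// Observation is one streamed training example for y ≈ w*x + b.
type Observation struct {
	X, Y float64
}

// Model holds the parameters of a single-feature linear model.
type Model struct {
	W, B float64
}

// Learn consumes observations from a channel, updating the model
// online with one stochastic-gradient-descent step per example.
// It returns when the producer closes the channel.
func Learn(stream <-chan Observation, lr float64) Model {
	m := Model{}
	for obs := range stream {
		pred := m.W*obs.X + m.B
		err := pred - obs.Y
		m.W -= lr * err * obs.X // gradient of squared error w.r.t. W
		m.B -= lr * err         // gradient w.r.t. B
	}
	return m
}

func main() {
	stream := make(chan Observation)
	go func() {
		// Producer goroutine: stream examples drawn from y = 2x + 1.
		for i := 0; i < 2000; i++ {
			x := float64(i%10) / 10.0
			stream <- Observation{X: x, Y: 2*x + 1}
		}
		close(stream)
	}()
	m := Learn(stream, 0.1)
	fmt.Printf("w=%.2f b=%.2f\n", m.W, m.B) // prints "w=2.00 b=1.00"
}
```

Because the producer and learner are separate goroutines joined only by the channel, new data sources can be swapped in without touching the learning code, which is the appeal of the streaming approach the project advertises.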

The newest of the bunch, and in some ways the most intriguing, is Gorgonia. This machine learning library, written entirely in Go, “provides the necessary primitives to dynamically build neural networks and assorted machine learning algorithms,” according to the author, a Sydney, Australia-based developer who goes by the nickname “chewxy.”

The key adjective is “dynamic.” Like the machine learning library Theano before it, Gorgonia lets you describe the behavior of a neural network in fairly high-level terms with a set of library primitives. This approach, also used by the TensorFlow library, frees the developer from having to write algorithms by hand and provides pieces that can be reused across different projects.

One key motivation for why machine learning projects might be worth creating in Go was raised by Gorgonia’s creator in the Hacker News thread where he was promoting the project: “Part of the reason why I wrote Gorgonia was because I spent waay [sic] too long trying to deploy Theano on the cloud (this was circa 2 years ago).”

A purely Go solution means fewer pieces from different languages that would have to be packaged and deployed together. But the main advantage of having these libraries in Go isn’t deployment, but developer comfort. Prospective machine learning developers now have that many more languages to be productive in — and it means existing Go developers who want to become machine learning pros can get a leg up in a domain with which they’re already comfortable.

[Source:- JW]

Windows 10 Mobile news recap: New Lumia OTA firmware, HP Elite x3, and more

Welcome back to our weekly Windows 10 Mobile news recap series, where we go over the top stories of the past week in the world of Microsoft’s mobile operating system. Let’s get started.

Intel might not be ditching mobile processors

According to a PCWorld interview with Venkata Renduchintala, president of the Client and IoT Businesses and Systems Architecture Group at Intel, his company might just be staying in the mobile processor game. While there wasn’t any firm commitment as to the future of mobile and IoT for Intel, Venkata stated that his plan for the future is fairly straightforward and that he’s open to getting into the ARM ecosystem.  Before now, many people were under the impression that Intel had completely lost interest in mobile, so hearing any glimmer of hope for the company’s mobile future can only be a good thing.

New update to Mail and Calendar apps hits Windows Insiders

A new update has come to the Mail and Calendar apps for Windows 10 and Windows 10 Mobile, and it’s available to Insiders before anybody else. The update tied together the “People” app with both Mail and Calendar, and it also introduced the ability to download all attachments from an email at the same time. The update seemed to be playing catch-up with the iOS and Android versions of desktop Outlook and Outlook Mobile, which both already have contact management built in.

More Lumia users get double-tap to wake functionality

Double-tap to wake is a feature that’s been talked about for a while now, as it has slowly reached more and more Lumia users. Today, a relatively large wave of users got access to the feature on the Lumia 950 over the air, as well as the Lumia 650. The OTA update means that people who hadn’t already manually updated to the new firmware will now be able to get access to it easily through their carrier, without having to go through any other channels. While the Lumia 550 also got access to this update, it won’t be getting the double-tap to wake functionality that the 950 and 650 are.

New AdDuplex numbers show that Windows 10 Mobile users like to stay up to date

The Windows 10 Anniversary Update hit just about a month ago, and we’ve got a plethora of charts and numbers from AdDuplex to give you an idea about how it’s rolled out, as well as a few other statistics that you might want to know. As it turns out, people using Windows 10 Mobile tend to be pretty active, keeping up to date with builds. According to AdDuplex, nearly 90% of all Windows 10 Mobile users are either on the latest stable release of Windows 10 Mobile, or they’re running an Insider build. If you want to learn a bit more, check out our article covering the whole study – there’s some pretty interesting stuff on where Microsoft stands on market share, and how Windows 10 users behave.

HP Elite x3 won’t be sold in Microsoft stores until October

It’s safe to say that things have been complicated when it comes to the release of the HP Elite x3, HP’s flagship Windows 10 Mobile phone. Between delays for its upgrade to the Anniversary Update and delays to its shipping date to the United States, people who were banking on the HP Elite x3 have got to be frustrated at just how patient they’ve had to be. If you thought that the delays were over, though, you would be sadly mistaken. Those who planned to get their HP Elite x3 through a Windows Store are going to have to wait a little bit longer than everybody else, since the phone won’t be available there until October 11th. Hang in there, everyone – you’ll get your phone eventually.

[Source:- Winbeta]

A tour through tool improvements in SQL Server 2016

Two practices drive successful modern applications today – a fast time to market, and a relentless focus on listening to customers and rapidly iterating on their feedback. This has driven numerous improvements in software development and management practices. In this post, I will chronicle how we’ve embraced these principles to supercharge management and development experiences using SQL Server tooling.

SQL Server 2016 delivers many SQL tools enhancements that converge on the same goal of increasing day-to-day productivity, while developing and managing SQL servers and databases on any platform. This post provides an overview of the improvements and I’ll also drop a few hints about what’s on the way. With SQL Server 2016:

  • It’s easier to access popular tools, such as SQL Server Management Studio (SSMS) and SQL Server Data Tools (SSDT).
  • Monthly releases of new SQL tools make it easy to stay current with new features and fixes.
  • Day-to-day development is being simplified, starting with a new connection experience.
  • New SQL Server 2016 features have a fully guided manageability experience.
  • Automated build and deployment of SQL Server databases can improve your time to market and quality processes.

Finding and using the most popular SQL tools is easier than ever

We received insightful feedback from customers about how difficult it was to find and install tooling for SQL Server, so we’ve taken a few steps to ensure the experience in SQL Server 2016 is as easy as possible.

Free and simple to find and install SQL tools

The SQL Server tools download page is the unified place to find and install all SQL Server-related tools. The latest version of SQL tools doesn’t just support SQL Server 2016, but it also supports all earlier versions of SQL Server, so there is no need to install SQL tools per SQL Server version. In addition, you don’t need a SQL Server license to install and use these SQL tools.

SSMS has a new one-click installer that makes it easy to install, whether you’re on a server in your data center or on your laptop at home. Additionally, the installer supports administrative installs for environments not connected to the Internet.

All your SQL tools for Visual Studio in one installer, for whichever version of SQL Server you use

SQL Server Data Tools (SSDT) is the name for all your SQL tools installed into Visual Studio. With just one installation of SSDT in Visual Studio 2015, developers can easily integrate efforts to develop applications for SQL Server, Analysis Services, Reporting Services, Integration Services and any application in Visual Studio 2015 for SQL Server 2016 – or older versions as needed.

SSDT replaces/unifies older tools such as BIDS, SSDT-BI and the database-only SSDT, eliminating the confusion about which version of Visual Studio to use. From Visual Studio 2015 and up you’ll have a simple way to install all of the SQL tools you use every day.

Easy to stay current – new features and fixes every month

One of the goals for SQL tools is to provide world-class support for your SQL estate wherever it may be. This estate could comprise SQL servers running on-premises or in the cloud, or some hybrid of both. We support it all. In order to enable world-class coverage of this diverse estate, we have adopted a monthly release cadence for our SQL tools. This faster release cycle brings you additional value and improvements, whether it’s enabling functionality to take advantage of new Microsoft Azure cloud features, issuing a bug fix to address particularly painful errors, or creating a new wizard or dialog to streamline management of your SQL Server.

These stand-alone SSMS releases include an update checker that informs you of newer SSMS releases when they become available. SSDT update notification continues to be fully integrated with Visual Studio’s notification system. You can keep up to date and learn more about the SSMS and SSDT releases at the SQL Server Release Services blog.

Day-to-day development is being simplified, starting with a new connection experience

Discover and seamlessly connect to your databases anywhere

No more need to memorize server and database names. With just a few clicks, the new connection experience in SQL Server Data Tools helps you automatically discover and connect to all your database assets using favorites, recent history, or by simply browsing SQL servers and databases on your local PC, network and Azure. You can also pin databases you frequently connect to so they’re always there when you need them. In addition, the new connection experience intelligently detects the type of connection you need, automatically configures default properties with sensible values and guides you through firewall settings for SQL Database and Data Warehouse.

Streamline connections to your Azure SQL databases in SSMS

The new firewall rule dialog in SSMS allows you to create an Azure SQL Database firewall rule within the context of connecting to your database. You no longer have to log in to the Azure portal and create a firewall rule before connecting to your Azure SQL Database with SSMS. The firewall rule dialog auto-fills the IP address of your client machine and optionally lets you whitelist an IP range to allow other connections to the database.

Fully guided management experiences

SQL Server 2016 is packed with advanced, new features including Always Encrypted, Stretch Database, enhancements with In-Memory Table Optimization and new Basic Availability Groups for AlwaysOn — just to name a few. SSMS delivers highly intelligent, easy-to-click-through wizard interfaces that help you enable these new features and make your SQL Server and Database highly secure, highly available and faster in just a few minutes. There’s an easy learning curve, even though the technology that’s under the hood enabling your business is powerful and complex.

Adopting DevOps processes with automated build and deployment of SQL Server databases

Features such as the Data-tier Application Framework (DACFx) technology and SSDT have helped make SQL Server the market leader of model-based database lifecycle management technology. DACFx and SSDT offer a comprehensive development experience by supporting all database objects in SQL Server 2016, so developers can develop a database in a declarative way using a database project.

Using Visual Studio 2015, version control and Team Foundation Server 2015 or Visual Studio Team Services in the cloud, developers can automate database lifecycle management and truly adopt a DevOps model for rapid application and database development and deployment.

What’s coming next in your SQL tools

In the months to come, you can look forward to continued enhancements in both SSMS and SSDT that focus on increasing the ease with which you develop and manage data in any SQL platform.

To this end, SSMS will feature performance enhancements and streamlined management and configuration experiences that build on the new capabilities provided by the Visual Studio 2015 shell. Similarly, SSDT will deliver performance improvements and feature support to help database developers handle schema changes more efficiently. Learn more about tooling improvements for SQL Server 2016 in the accompanying video.

Improvements like these can’t happen in a vacuum. Your voice and input are absolutely essential to building the next generation of SQL tools. And the monthly release cycle for our SQL tools allows us to respond faster to the issues you bring to our attention. Please don’t forget to vote on Connect bugs or open suggestions for features you would like to see built.

[Source:- blogs.technet]

A picture tells a thousand words: Visualising the mainframe for modern DevOps

Steven Murray, Solutions Director at Compuware, looks at the challenges of the mainframe in the app economy.

From shopping online, to scrutinising our bank balances, digital technology has become ingrained in everyday life, making applications the lifeblood of today’s organisations. Consumers expect these digital services to work seamlessly, which has driven businesses to adopt more intelligent approaches to developing and maintaining them, such as DevOps. These modern approaches enable IT teams to work more closely in order to deliver flawless digital services and updates in much shorter timeframes than have previously been possible.

However, although most of us use these digital services on a daily basis, very few realise that as we live, work and play online, the app economy is in large part underpinned by the mainframe. Despite having been around for over 50 years, even today the mainframe is responsible for crunching the numbers and processing over 30 billion transactions that power the digital economy every single day. Mobile banking, for example, relies heavily on a string of complex digital services that draw data from the mainframe, despite the core service visible to the consumer being delivered through a flashy new modern app. It’s therefore unsurprising that 88% of CIOs say the mainframe will remain a core business asset over the next decade.

The mainframe of the digital economy

Despite its central role in supporting digital services, very few modern developers have experience working on the mainframe, and even fewer understand the complex interdependencies that exist between these legacy systems and distributed applications. This is in part due to the isolated environment that specialist mainframe developers have historically worked in; the very same secluded working environments that DevOps encourages companies to move away from. As mainframe developers worked in silos, independent of others, the newer generations of developers were alienated from the platform and had little opportunity to learn from their more experienced colleagues. This has created a shortage of skilled mainframe developers as the older generations continue to reach retirement age.

In an increasingly interconnected IT environment, this skills shortage is hindering DevOps initiatives. If developers aren’t aware of the detrimental impact that a single update to one application can have on the wider ecosystem, they’ll find it nearly impossible to deliver error-free digital services and updates as quickly as they’re required to. So how can businesses keep up with rising consumer expectations and enable programmers with little to no mainframe experience to deliver flawless updates to applications on any platform?

Enter the polyglot programmers

First and foremost, companies need to work towards enabling their developers to work interchangeably on any IT project, regardless of the platform or programming language behind it. As most developers have little experience working on the mainframe, companies need to provide them with modern tools and development interfaces that are more familiar to them and can be used across any IT platform. Importantly, mainframe tools must be integrated with popular and familiar open source/distributed DevOps solutions so that developers can use the same tools for COBOL, Java and other languages.

Having one modern interface and a common set of tools across all platforms will help to unify agile software development lifecycle workflows, enabling programmers to switch seamlessly between tasks regardless of platform and bringing the mainframe into the fold of mainstream IT. With this approach the mainframe differs only in syntax, and languages like COBOL become just another programming language for developers to learn.

However, enabling developers to update mainframe applications is only half the battle. Companies must also find a way for developers to intuitively understand the complex interdependencies between the applications and data sets that are integral to the services they’re delivering, without digging through incomplete documentation or building up the same specialist knowledge that took their veteran colleagues years to acquire. Rather than expecting developers to manually trace the complex web of interdependencies between applications and data, modern visualisation technologies can do the legwork for them. The ability to instantly visualise the relationships between digital services, as well as the impact of any changes on the wider ecosystem, will enable developers to update mainframe applications with confidence that there won’t be any unforeseen consequences.
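Under the hood, this kind of impact analysis amounts to a graph traversal: treat applications and data sets as nodes, dependencies as edges, and a change’s blast radius as everything reachable from the changed component. A toy sketch of the idea in Python (purely illustrative; the component names are made up, and the commercial visualisation tools described above do far more):

```python
from collections import deque

def impact_of(change, dependents):
    """Return every component transitively affected by changing `change`.

    `dependents` maps a component to the components that depend on it,
    so the result is the reachable set, i.e. the change's blast radius.
    """
    affected, queue = set(), deque([change])
    while queue:
        node = queue.popleft()
        for downstream in dependents.get(node, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected

# Hypothetical topology: a mainframe table feeds a COBOL batch job
# and an API, which in turn feeds a mobile app.
deps = {
    "CUSTOMER_DB2_TABLE": ["COBOL_BATCH_BILLING", "ACCOUNTS_API"],
    "ACCOUNTS_API": ["MOBILE_BANKING_APP"],
}
print(sorted(impact_of("CUSTOMER_DB2_TABLE", deps)))
```

Visualising that reachable set is what lets a developer see, before shipping, that a change to one table ripples out to the mobile app.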

Mainstreaming the mainframe

Ultimately, education is needed to convince non-mainframe programmers that there is a bright future for these hugely reliable and powerful mainframe systems. To encourage this, businesses need to integrate the mainframe into mainstream cross-platform DevOps processes. The easiest way to achieve this is if the same interfaces and tools can be used across all platforms. This will also encourage collaboration, eliminating mainframe silos and promoting cross-platform DevOps, which should make programming on the mainframe easier and more intuitive to developers who usually work on distributed applications.
[Source:- CBR]

New Red Hat project looks a lot like a Docker fork

There have been rumblings about a possible split in the Docker ecosystem. Now Red Hat has unveiled a project that may not be pitched as a Docker fork, but sure has the makings of one.

The OCID project uses many Docker pieces to create a runtime for containers that can be embedded directly into the Kubernetes container orchestration system.

That version of the Docker runtime, Red Hat says, has been built for those who “need a simple, stable environment for running production applications in containers” — a broad hint that Docker’s “move fast and break (some) things” philosophy of product development has spurred a backlash.

The mother of invention

The technical details of OCID are not complicated. It’s a set of projects that provides Kubernetes with the ability to obtain and run container images by way of a version of the core of the Docker runtime — the “runC” project — that has been modified to fit Kubernetes’ needs.

Some of these modifications are purely practical, providing Kubernetes with features that are useful when running containers at scale, such as the ability to verify whether a locally held container image is the same as the one found in a container registry.
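That check rests on content-addressable digests: a registry identifies an image manifest by its SHA-256 hash, so a local copy can be compared against the registry version without pulling it again. A minimal conceptual sketch in Python (not OCID’s actual code; the function names are invented for illustration):

```python
import hashlib

def manifest_digest(manifest_bytes: bytes) -> str:
    """Return the content-addressable digest of an image manifest,
    in the registry's usual 'sha256:<hex>' form."""
    return "sha256:" + hashlib.sha256(manifest_bytes).hexdigest()

def is_up_to_date(local_manifest: bytes, registry_digest: str) -> bool:
    """True if the locally cached image matches the registry copy."""
    return manifest_digest(local_manifest) == registry_digest

# Identical content compares equal; any drift changes the digest.
manifest = b'{"schemaVersion": 2, "layers": []}'
remote = manifest_digest(manifest)
print(is_up_to_date(manifest, remote))         # local copy is current
print(is_up_to_date(manifest + b" ", remote))  # content has drifted
```

Because the digest is derived entirely from the bytes of the manifest, the comparison needs only the registry’s advertised digest string, not the image itself.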

Other features are more strategic variations on existing Docker functionality, with philosophical differences that stem from how Kubernetes is used in production. The OCID storage driver, for instance, “provide[s] methods for storing filesystem layers, container images, and containers,” according to Red Hat, but allows storage images to be mounted and handled more like Linux filesystems, instead of in-memory objects only known to Docker.
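The distinction Red Hat draws here is essentially union-mount semantics: each image layer is a directory tree, and files in upper layers shadow identically named files in lower ones, much as overlayfs presents layers on Linux. A deliberately simplified Python model of that lookup (illustrative only, not the OCID storage driver):

```python
def resolve(layers, path):
    """Resolve a file through a stack of image layers.

    `layers` is ordered bottom-to-top; the topmost layer containing
    `path` wins, mirroring overlay/union-mount semantics.
    """
    for layer in reversed(layers):
        if path in layer:
            return layer[path]
    raise FileNotFoundError(path)

# Hypothetical image: a base OS layer, an app layer, and a patch layer.
base  = {"/etc/os-release": "base image", "/bin/sh": "shell"}
app   = {"/app/server": "application binary"}
patch = {"/bin/sh": "patched shell"}

image = [base, app, patch]
print(resolve(image, "/bin/sh"))          # the patch layer shadows base
print(resolve(image, "/etc/os-release"))  # falls through to the base layer
```

Treating layers as real, mountable filesystem trees, rather than objects known only to the Docker daemon, is what lets other tools on the host inspect and share them.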

Fork in the road ahead

Reading between the lines of the news release, there are strong hints that the OCID project arose because Red Hat found itself at odds with the pace and path of Docker’s development.

According to the release, work on the storage component of OCID was hobbled because “upstream Docker was changing at a rate that made it difficult to build off of.” Likewise, when Red Hat proposed remote examination of a container as a possible standard add-on, “the Docker community showed little interest in such a capability.”

Chalk this up to the fact that Red Hat and Docker generally aim for different audiences. Red Hat targets enterprises that want to run applications at scale by way of a whole gamut of tools: its newly container-centric Linux stack, its OpenShift container platform (version 3.3 of which was also released today), and Kubernetes as the mechanism for combining and managing everything. The sheer size of such a stack, and the demands an enterprise makes on it, mean it can’t be built on shifting sands.

What Red Hat wants

Docker, on the other hand, has been driven more by the enterprise developer than by the enterprise itself. It isn’t afraid to iterate quickly and assume its audience is agile enough to keep up. It has also been attempting to present itself as a one-stop, end-to-end solution for deployment.

Bundling Docker Swarm as a native orchestration solution, for instance, was meant to provide an out-of-the-box option to get a cluster running — and to give Docker users a reason to use Docker-native tools generally. But Kubernetes is making a case for itself, both because of its open-ended community and because people serious about scale (such as OpenStack) tend to turn to Kubernetes as a once-and-for-all solution.

It’s not in Red Hat’s best interest to seem divisive, though. To that end, the announcement about OCID is liberally salted with statements of open source goodwill: Red Hat wants to “drive broad collaboration” by contributing these tools back to the container ecosystem at large and by “engaging with upstream open source communities.”

But Docker is under no obligation to accept any particular pull request. And if Red Hat’s intention is to build a powerful container stack that’s distinctly its own, it will be all but obliged to diverge from Docker. The question isn’t whether Red Hat will do so, but by how much and to what end.

[An earlier version of this story incorrectly stated that the Open Container Initiative (OCI) as well as Red Hat was part of the OCID project. The OCI is not involved with OCID.]

This story, “New Red Hat project looks a lot like a Docker fork” was originally published by InfoWorld.

[Source:- JW]

Office Delve for Windows 10 makes its way to Windows 10 Mobile in Preview

Delve, the newest addition to Office 365, has still not been officially announced, but that hasn’t stopped the app from coming to mobile. Until earlier today it was a PC-only UWP application, and while it was already available on Apple’s iOS and Google’s Android operating systems, it was missing from Microsoft’s own mobile platform.

To download Delve, all you have to do is go to the link at the bottom of the article, but to use it you will, unfortunately, need a Work or School account, as with many preview apps in the Windows Store.

But what is Delve? The official description reads:

Delve helps you stay in the know, powered by who you know and what they are working on. With this preview app for Windows 10, you’ll be notified about document updates, and get document suggestions that are relevant to your work. You can also find people and get back to your recent documents and attachments, all in one place – all in one app.

Key features of the app:

  • Get updates about what your colleagues are working on
  • Find relevant documents and attachments based on people you know
  • Get back to important documents you’re actively working on

[Source:- Winbeta]