2016: between containers, hyper-convergence, cloud and big data.

2016 could be defined as a year of change and challenges, where vendors, customers, integrators and developers are the main characters of a story made of cloud integration and automation, analytics, security and performance across every infrastructure. In this scenario Italy doesn't play an important role, because the economic depression, missing legislation and culture gaps don't go in the same direction as innovation and research. But for Italian sysadmins and developers, these difficult situations can turn into great opportunities.

Let's look at what I have seen during 2016 in infrastructure and cloud services, and at my expectations for 2017.

Client desktops, wearables and smartphones

The trend of the last year (or better, the past two years) is bringing the office into one hand. Client devices in this new model look like very powerful tablets with a keyboard: a more portable way to do any kind of work without compromising performance or the freedom to connect a USB device. In this field, vendors like Microsoft, Dell and Acer are moving into the era of hand-portable technology, leaving behind the idea of a big screen with an extended keyboard and embracing compact, lightweight designs with a nice touch screen. The launch of Windows 10 also marked an important step for Microsoft: this client OS finally comes as a lightweight OS, able to run on relatively modest hardware (2 GB of RAM).

In Apple's house, the iPad Pro, the new MacBook Pro (with a redesigned concept for the function keys) and the iPhone 7 are the newest products of the year (or still in the air), but the risk for the Cupertino giant is making this product line far more expensive than the competition.

The growth of wearable devices confirms the marketing predictions for 2016: the iWatch and similar devices are the required step towards the future of information and communication… but wait! Wasn't that a prediction from the past?

From client to enterprise computing

Another side of IT sees the app-in-cloud explosion that has radically transformed the way applications are developed and maintained: now we are talking about the “Cloud Native Application”. This change starts from the infrastructure layer (physical and virtual), and virtualization is the prerequisite for starting to think “cloud”. Automation and integration are the other mandatory elements that, alongside virtualization, guarantee SLAs and time to market. On this side, important changes of direction come from Dell and EMC (now the same company), sustained by the improvements in the virtual infrastructures VMware vSphere and Microsoft Hyper-V and in the way software-defined storage and networking are conceived: they are moving to hyper-convergence.

From the application perspective there is a little earthquake: the growth of Cloud Native Applications and the explosion of cloud services are causing a decline in on-premises and Tier-1 applications. The main reasons for this change of direction are the lack of efficiency and the need to scale by expanding the infrastructure, which is sometimes difficult to justify. For this reason, today, developing an on-premises application could be considered a high-risk investment!

Now it's time to design new applications with another objective: be cloud native, and be hybrid or off-premises! And this is what new software (or better, new service) companies have been doing over the last two years (Italy included).

The end of the SAN… or maybe not?

Applications rule the infrastructure choices: they used to need a SAN, and IOPS, capacity and scalability were the parameters driving the choice of storage vendor, model and configuration: FC/iSCSI, active-active/active-passive, RAID, Flash/SSD/SAS/NL-SAS… But the trend seems to be changing: on the infrastructure side, the as-a-service model is influencing the way storage is conceived. It is no secret that some critical applications are now running on vSAN. And this is one of the reasons why important storage vendors are rethinking where data should be stored:

  • new applications are moving to hyper-converged solutions
  • for business intelligence applications, a mix of hyper-converged and big-data systems could be an interesting solution
  • old applications go through a cost/benefit evaluation to choose the infrastructure type that fits the business requirements
  • legacy applications remain on the SAN

Speaking of SAN, what's new in terms of technology improvements? The new challenge is to predict and measure storage parameters in order to really address application needs. The All Flash Array (AFA) is the top-end solution, giving low latency and high performance throughout the application lifecycle, but looking at Nimble Storage, there is a new way to deliver capacity, performance, efficiency and predictable behaviour with simplicity. Thinking about 2017, is the storage administrator threatened with extinction?

VDI or not VDI… that is the question.

There is more than one reason why VDI is not in the innovation mainstream of enterprise companies. It's known that VDI can now deliver the same user experience as a physical workstation, but companies are still wary about its adoption: between physical and virtual, the cost is roughly the same. And the real reason to adopt VDI is sometimes still missing: automating and securing all end-user computing. And what about Windows licensing? There is no VDI as-a-service, and an additional license is still required if you want to virtualize your workstations. This last point is one of the reasons why some companies are thinking about Linux-based VDI, but the road is still long before it can be considered a definitive solution. Let's see what happens during 2017 and hope Microsoft changes its license policy.

Focusing on governance and BYOD, VMware Workspace ONE and VMware Horizon FLEX are interesting solutions to keep business data inside the company while offering freedom and security in the same place. On this side there is more to do and more to evangelize to keep customers and system integrators warm. Let's see what happens as end-user security becomes a focus for every company.

From VMworld…

The announcements from VMworld 2016 Barcelona fall into two main areas:

  • Cloud services developed through a partnership with IBM
  • vSphere and Amazon AWS integration

The vSphere 6.5 release comes with important improvements and integrations, like the ability to handle single or multiple virtual datacenters in a single pane of glass, and new vSAN improvements that boost the concept of the software-defined datacenter. I think we'll see implementations and cases in the main vertical areas; but like all upgrades, it's important to follow the same golden rules before proceeding (a small sketch follows the list below):

  • check all the other components in the environment (NSX, vRA, …)
  • check the hardware/software compatibility matrix
  • check known bugs and look for information about fix releases.
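As a practical aid for the second point, here is a minimal sketch, assuming pyVmomi and a reachable vCenter (the hostname, credentials and the very idea of doing this check from Python are my own illustration, not part of any official upgrade guidance). It lists the version and build of every ESXi host, so they can be compared against the vSphere 6.5 hardware/software compatibility matrix:

    # Minimal sketch: list ESXi host versions before an upgrade.
    # Assumes pyVmomi is installed; vCenter address and credentials are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    context = ssl._create_unverified_context()  # lab use only: skips certificate checks
    si = SmartConnect(host="vcenter.example.local",
                      user="administrator@vsphere.local",
                      pwd="secret",
                      sslContext=context)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.HostSystem], True)
        # Print version and build for every ESXi host, to check against the
        # hardware/software compatibility matrix before moving to 6.5.
        for host in view.view:
            product = host.config.product
            print("{}: {} (build {})".format(host.name, product.fullName, product.build))
    finally:
        Disconnect(si)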

Something interesting comes from this blog post: http://anthonyspiteri.net/vsphere-6-5-whats-in-it-for-service-providers-part-1/?utm_content=buffer783c8&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

VMware, Microsoft and Linux on the journey to containers

The buzzword of the last two years is container. Almost all vendors are moving in this direction, proposing their own container vision and integrations:

  • The release of VMware vSphere Integrated Containers and Photon Controller marks an important step in VMware product history and in its ecosystem.
  • Microsoft, with Windows Server 2016, has taken an important step towards a new vision of system delivery: Nano Server and containers to secure workloads across hypervisors and server instances.
  • Docker's growth is what DevOps teams were expecting this year: the important announcements of the year are Docker Datacenter and the integration with Windows and Azure.

I want to highlight this great talk by Docker at Tech Field Day 12:

https://blog.docker.com/2016/12/get-docker-talks-tech-field-day-12/

and an amazing presentation by Stephen Foskett during the Emerging Technology Summit 2016:

Security and Data protection

2016 registered a record number of data breaches. Yahoo, with 500 million accounts stolen, is the example that confirms security is NOT optional in IT. In Italy this sounds like a false alarm; I heard some companies say: << We are not Yahoo! Who cares about my data? >> In the meantime, crypto-lockers were invading their shared NAS appliances. Well, no: some of these companies paid a lot of money to get their data back. << Wow! Don't pay now so you can pay double tomorrow? That's amazing! >> No, it isn't.

But this is not only an Italian issue… the real improvement in security comes from design decisions and an initial data analysis (aka vulnerability assessment): before starting to build an IT project, before performance, availability and scalability, it's necessary to think about:

  • how to protect your data
  • how to recover your data

The first one includes network protection, anti-malware and backup/DR. But some data breaches come from human mistakes and cultural gaps (I have heard this multiple times!). The real question that all companies must ask themselves at the end of 2016 is: << Are you safe against data breaches? >> Splunk, RSA and VMware NSX are great tools, but remember: the first layer of security comes from people!

NSX, microsegmentation and containers

Freedom with security is the mantra adopted by some vendors, who want to help designers and the business do their work better without compromising security. This is what VMware has been doing over the years, to ensure that the workload explosion does not become a huge attack surface.

But what about containers? If the ratio of physical servers to virtual machines is 3:50, containers could represent a real workload explosion and a dangerous security problem. Fortunately, thanks to their isolation, containers should be more secure than virtual machine operating systems, but the attack surface is growing, and in some scenarios perimeter security is not enough.

Microsegmentation is the new way to think about security in a datacenter: it started a few years ago and is now looking at the next level, providing security for containers. For this reason, VMware in vSphere Integrated Containers uses a 1:1 ratio between lightweight virtual machines and containers, flowing all container traffic through a distributed vSwitch. In this way all networking inside and outside the application environment can be software defined and controlled by the DFW, the NSX distributed firewall.

IMHO: IaaS, PaaS or, better, CaaS (Containers as a Service) are the great use cases for VMware vSphere and NSX.

Data Protection and Availability

The data availability landscape keeps growing around virtualization. But virtual system administrators are looking for a product that combines backup and disaster recovery in a single solution.

Veeam, with Availability Suite 9.5, is confirming its hard work with VMware and Microsoft, providing different backup, replication and restore granularities: file, VM and SAN disk. One of the recent interesting integrations is with Nimble Storage, providing data protection with no impact on the workload.

Cohesity is boosting its growth: hiring more technical, pre-sales and sales staff (some of them coming from PernixData, VMware and Nutanix), it is going to rock the VMware ecosystem with its “secondary data life”.

NAKIVO Backup & Replication is coming under my spotlight, because the ability to protect Amazon EC2 workloads and the granular instant restore of files, Microsoft Exchange objects and Microsoft Active Directory objects make this product really interesting. If you want to know more, watch for upcoming blog posts during this year.

IoT, big data and smart metering

Another trend in many vertical areas like energy, government, healthcare and manufacturing is the Internet of Things (IoT). A new ecosystem is being built on the way data is carried over the “air” and on consolidated methods to store and consume it. For this reason, thinking about smart cities and smart metering, the R&D groups in many companies could play important roles in the future.

I'm an example! During this year, my new part-time role in the R&D department of the company where I work has been studying and modelling how to handle data from many IoT sources (and other sources) and integrate it with big-data analysis systems. For this reason I'll provide on this blog some useful information about the integration of MQTT brokers, Apache Hadoop, NiFi, Kafka, etc. Stay tuned!
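To give a concrete flavour of this kind of integration, here is a minimal sketch, assuming the paho-mqtt and kafka-python libraries (the broker addresses and the “sensors/#” and “iot-telemetry” topics are placeholders of mine, not details of the real project): it forwards IoT telemetry from an MQTT broker into a Kafka topic, from where NiFi flows or Hadoop jobs can pick it up.

    # Minimal sketch: bridge MQTT messages into Kafka.
    # Assumes paho-mqtt and kafka-python are installed; addresses and topics are placeholders.
    import paho.mqtt.client as mqtt
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")

    def on_message(client, userdata, message):
        # Each MQTT payload becomes a Kafka record for downstream consumers
        # (for example NiFi flows or Hadoop batch jobs).
        producer.send("iot-telemetry", message.payload)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.subscribe("sensors/#")
    client.loop_forever()

The nice part of this pattern is that Kafka decouples ingestion from analysis, so the same stream can feed both real-time consumers and batch jobs.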

Route to 2017

Will 2017 see some interesting challenges in IT areas like containers, availability features and data analysis? If 2016 is known for the growth of cloud thinking and the big-data explosion, 2017 could be the consolidation of these concepts. Vendors, integrators, software houses and all the people working in IT should remember that integration is king. Changes arrive every month and their echo is reflected in business choices. Remember that a closed mind is not the right direction in a world where non-tech people (humans) are the judges of your choices.
