DirectionIT Magazine Issue 10


WWW.DKL.COM

December 2018


Contents

A cloud for those who can’t move to the cloud
P.6

Have you calculated your Network effect?
P.12

Part I: Analytics and Machine Learning in the Modern Data Center
P.18

How the Internet-of-Things will lead to selling lots of things
P.24

DevOps in the Multi-Platform Data Center
P.30





Letter from the Editor

As yet another successful year comes to a close, the one common thread present in all business is, once again, that technology is the single greatest driver as we move into the New Year. In 2018, we saw the rise and acceptance of the need for digital transformation and all that it can deliver for business success, we witnessed the silos of IT begin to dissolve with the promise of interconnectivity, and we saw the renewed rise of the mainframe as it solidifies itself as the powerhouse that will drive next-generation data.

And of course there is the data—the world’s new gold. It is now apparent that data is, and will continue to be, the single greatest asset for any company—a fully digital economy that will drive us to new heights and new possibilities. With that data have come new approaches: from the days of the “Big Data” discussions has come AI to manage and analyze it, faster and better than our own human minds can.

Perhaps that is the greatest point of 2018, and what is next for all of us collectively in 2019—a world where AI, IoT and machine learning unlock the very best human potential. As I watch these technologies unfold, I see a glimmer of hope that the very technology that has in some ways separated us into our own silos as human beings—the loss of real human interaction—may be the very thing that draws us all closer together.

In any case, I sincerely thank you for your ongoing support in 2018 and wish all of you a happy, safe, and prosperous New Year!

Allan Zander
Editor-in-Chief



Why IBM Cloud Private (ICP) could be the perfect option

By Matt Hejnas, Partner, Principal Architect, Private Cloud, Cedrus Digital Transformation Solutions

In today’s ever-connected business environment, the cloud has come to represent the cornerstone of differentiation and competitive advantage. For most companies, it is the environment where the majority of their business processes and applications reside. For many companies, however, the public side of cloud computing is simply not a possibility: regulatory challenges and other constraints take the option off the table. That said, there are ways to experience all the benefits of the cloud without moving to a strictly public model, by leveraging a private or hybrid option. IBM’s private cloud offering, IBM Cloud Private (ICP), delivers all the benefits of public cloud while remaining compliant with a multitude of industry-specific regulations—enabling users to effectively manage enterprise workloads with the benefit of extra controls for security.



THE IMPORTANCE OF THE CLOUD

So, why is the cloud so important? Trust me when I say I’m quite aware that the question sounds somewhat odd in this day and age. Most understand why the move to the cloud up until now has been so important. But that’s not where cloud ends; in fact, that’s where it begins. The need for companies to embrace the next phase of cloud—known to most as digital transformation—will come to represent the next evolutionary stage in computing as we know it. The cloud will now be the epicenter for all modernized business processes, leveraging everything from Artificial Intelligence (AI) and machine learning, to the Internet-of-Things (IoT), digital assistants and, of course, the new gold standard of business: ubiquitous data. In fact, it’s data that is the single greatest driving force of the new millennium—leveraged to achieve everything from better, more personalized end-user experiences all the way through to supreme business intelligence.


THE ADOPTION OF A CLOUD-BASED INFRASTRUCTURE

It’s for these reasons alone that organizations burdened with industry-related regulatory challenges must still embrace the cloud, albeit on their own terms. With ICP, IBM has created a Kubernetes-based container platform that helps organizations quickly adopt cloud-based infrastructure, enabling them to modernize and automate workloads. More importantly, it enables users of the platform to build new and highly innovative cloud-native applications to remain relevant and competitive in a world driven by cloud infrastructure. The only difference is that the development and deployment of new applications takes place in a highly secure, private infrastructure within one’s own data center or hybrid model, mitigating the security risks associated with public cloud options. For example, one of the most prevalent business challenges today is the need to modernize traditional applications. And, as our digital world continues to evolve, the demand on companies to enable better and more efficient scalability and resilience is paramount. Of course, like any modern IT challenge, that’s always easier said than done—but there can be light at the end of the tunnel.
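As a rough illustration of what running a containerized workload on a Kubernetes-based platform such as ICP involves, the sketch below uses the official Kubernetes Python client to define and create a small Deployment. The image name, namespace and replica count are placeholders invented for the example, not details from ICP's catalog; the same calls work against any conformant Kubernetes cluster, which is precisely the portability a platform like this banks on.

```python
# Minimal sketch: create one containerized Deployment on a Kubernetes-based
# platform using the official Python client. The image, namespace and
# replica count below are illustrative placeholders.
from kubernetes import client, config

def deploy_app(name="legacy-app", image="registry.example.com/legacy-app:1.0",
               namespace="modernization", replicas=2):
    # Reads cluster credentials from the local kubeconfig (~/.kube/config).
    config.load_kube_config()

    container = client.V1Container(
        name=name,
        image=image,
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template,
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )
    client.AppsV1Api().create_namespaced_deployment(namespace=namespace, body=deployment)

if __name__ == "__main__":
    deploy_app()
```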



THE ABILITY TO DEPLOY INTO THE CLOUD

ICP’s catalog includes containerized IBM middleware that is ready to deploy into the cloud—a benefit for those who dread the perceived long and arduous path to cloud readiness. In the case of ICP, containerization dispels those concerns, enabling users to avoid the trappings of application-specific breakage points when modernizing monolithic and legacy applications. It can also reduce downtime by enabling users to address and isolate application interdependency issues individually, without having to schedule downtime for an entire system.

THE CLOUD IS HERE TO STAY

So what does this truly mean for highly regulated business models? Simply put: cloud on one’s own terms. It’s clear that public cloud offerings enable far more options in planning one’s own data center environment. The elasticity afforded by the big players such as Amazon AWS, Microsoft Azure, and others is obviously of great value; in fact, it is the single biggest reason that many organizations—private and public sectors alike—are choosing to move more infrastructure to public options. This is exactly where IBM’s private cloud offering shines: whether an organization chooses a fully private, a hybrid, or a fully public model for its cloud deployment, managing High Availability (HA) workloads becomes far easier. And the ability to use a single deployment method at any time—should the organization decide to deploy to the public cloud and the on-premises data center simultaneously—makes it an easy choice. In the end, the cloud is here to stay, and a business of any nature requires that modernity to exist. Knowing that fully public cloud options are not for everyone, it’s nice to know there is still a cloud option available that can deliver the same benefits—but on one’s own terms.





HAVE YOU CALCULATED YOUR NETWORK EFFECT?

By Matt Edwards, Chief Operating Officer, Inteleca IT Business Solutions

The data center environment for most companies, as it stands in 2018, is in an interesting state of flux: the need for new equipment is on the rise (especially in the optical networking space) due to the impact of data growth. Pair that with the need to assess the impact of digital transformation as a whole, and data centers are now the epicenter for change within the corporate environment.



The dilemma created by OEMs

However, companies must also mitigate the risk of too much spending—a very real scenario, as the balance between buying new equipment and maintaining realistic budgets is ever present. Add to that dilemma the paradigm created by OEMs: it is sometimes “cheaper” to buy new than it is to maintain hardware, so the proverbial plot thickens even more. Many OEM maintenance programs are exorbitant, to say the least, because their main goal is to continually push new gear out of the door and keep revenue flowing. And with that OEM paradigm has come an indispensable adjacent industry based solely on maintenance, support and grey-market equipment—all designed to offset OEM contractual maintenance costs while maintaining business norms.

The value of IT maintenance and support

However, even with all of that in place, budgets and financial calculations rarely address the other side of the equation: the network effect, and how equipment relates to the value of its own operational status. This is where IT maintenance and support become far more valuable. The whole is, quite literally, greater than the sum of its parts.

In reality, it’s easy for the right IT maintenance provider to save you as much as 50 percent on your maintenance costs. Of course, you could argue: if the hardware and software costs of a server are negligible, why spend more money on an IT maintenance plan? It seems counterproductive, because how can you save as much as 50 percent on your maintenance costs when you are paying a provider to maintain hardware that is inexpensive and easily replaced? In this case, the math behind the equation becomes the most important factor.

The network effect as a key driving factor

This is where the network effect becomes a key driving factor. In both economics and business, a network effect, also called a network externality or demand-side economies of scale, is the effect that one user of a product or service has on the value of that product or service to other people. When a network effect is present, the value of a product or service depends on the number of people using it—the very essence of the data center and its equipment.

Servers, for example, are often treated as “stand-alone” even though they are in fact connected to many other servers, workstations, and mobile devices. Any individual server that runs any kind of virtualization technology is its own network and, by definition, no longer just one server. Make sense? Therefore, when you take any individual part of a network offline for replacement, it has a disproportionate effect on your network: although the hardware is only around 15 percent of the value of the device, the other 85 percent of its value disappears while it is being swapped out.

The argument that the hardware costs little, so why not just go ahead and replace it, gets completely inverted when you look at the whole environment. The low cost of powerful hardware is now the driving factor that increases the value of the server and its place in the environment. As hardware costs decrease, the ratio of hardware cost to total value continues to shrink, making it increasingly important to keep servers online and available, which you can do with the right IT maintenance provider.

The lesson in all of this is to factor in far more than simple line-item costs when setting budgets. Downtime, and how it impacts the value of a product or service, can cost far more than anticipated. Having a plan and a partner in place to mitigate that risk is the key to ongoing success.
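To close with rough numbers on that argument, the sketch below compares the value of the hardware itself with the operational value that goes dark while a device is being swapped out. The total device value is an assumption chosen purely for illustration; the 15 percent hardware share follows the discussion above.

```python
# Illustrative numbers only: the 15 percent hardware share follows the
# article; the total device value is an assumption for the example.
DEVICE_TOTAL_VALUE = 20_000  # hardware plus the service it delivers to the network

for hardware_share in (0.25, 0.15, 0.05):
    hardware_value = DEVICE_TOTAL_VALUE * hardware_share
    service_value = DEVICE_TOTAL_VALUE * (1 - hardware_share)  # what goes dark during a swap
    print(f"hardware at {hardware_share:.0%} of total value: "
          f"${service_value:,.0f} of service offline per swap vs "
          f"${hardware_value:,.0f} of hardware ({service_value / hardware_value:.1f}x)")
```

The cheaper the hardware becomes, the more lopsided that ratio gets, which is the network-effect case for prioritizing uptime, and the maintenance that protects it, over simple replacement cost.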





PART I: ANALYTICS & MACHINE LEARNING IN THE MODERN DATA CENTER

By Larry Strickland, Chief Product Officer, DataKinetics Data Performance & Optimization

Artificial Intelligence (AI) and Machine Learning (ML) are the latest exciting technologies and buzz in the business and tech worlds, taking over from the past buzz like GDPR, analytics, IoT and Big Data. These topics have already (arguably) become mainstream interests for most large businesses, and for many smaller businesses as well. And like these "past" buzzwords, AI and ML are not going away and, more importantly, are going to matter to your business at some point. Are they approaching mainstream now in 2018? Well, that's another story altogether.



HISTORY

Most will realize that machine learning has been around for quite some time; in fact, the term was coined in the 1950s, when ML was not much more than a concept. Interest and understanding began and grew, but the compute power needed to start real experimentation did not exist. Not even close. Only recently has that level of compute power become available, and that is why there is an upsurge in business interest: everyone sees the potential and realizes that tangible benefits are tantalizingly close. Businesses have boatloads of data: data about their customers, their products and services, themselves, their competitors, and everything else under the sun. They’ve started running analytics on it and now they’re hungry to take the next step: to use it as an input for ML algorithms, and to develop the AI to obtain more valuable insight.

IT’S ALL ABOUT THE DATA

I have always been a big proponent of starting small, whether we’re talking about Big Data, IoT or analytics. There’s nothing wrong with starting small to get your feet wet, to determine whether or not you’re on the right track, whether you have the necessary internal expertise available, and so on. Starting small means that you’re actively doing something, as opposed to waiting (forever) until you’re sure everything is in place before moving forward at all.

But you do need your data—because it is, after all, all about the data. That’s why we’re doing this: to get something out of our data. And that something is business value—to predict or to suggest options or courses of action to achieve favorable outcomes, like improved customer satisfaction or increased profits. To paraphrase a 1980s Al Pacino movie, “First you need the data, then you get the machine learning, and then you get the business value ...”

DATA LOCATION

Corporate data is all over the place, mostly because businesses rely on the best computing platforms for the jobs at hand. For example, data from transaction processing is often resident on the mainframe because the mainframe is the most powerful and cost-effective platform for that job. Similarly, other business data is server-based, either within local data centers or cloud-based data centers, or web-based, because the various business processes are served best by different computing platforms. Your business analytics repository may be on one platform, while your machine learning activity may take place on a different platform.

DATA TYPES

Do you need to start with all of your data? Certainly not, but you are well advised to start with all of the types of data that you have (or need). What does that mean? Well, to answer that, let’s borrow from a psychological analysis technique introduced in the 1950s and applied more recently to business, science and politics: the Johari Window.


• Open (known to self, known to others): things that are known by you and also seen and acknowledged by others.
• Blind (unknown to self, known to others): things that are unknown by you but seen and acknowledged by others.
• Hidden (known to self, unknown to others): things that are known by you but unknown by others.
• Unknown (unknown to self, unknown to others): things that are unknown by you and also unknown by others.

Figure 1: The Johari Window

The Johari Window nicely illustrates the pitfalls present in data analysis. Your end goal is obviously to make the best possible—correct—business decisions based on your data. And that is the kicker: you may only be including a portion of the information that you actually need for proper analysis and decision making. So let’s have a look at these different types of data using the Johari Window (a short data-inventory sketch follows the list):

• Open Area – Your comfort zone. The arena in which you spend most of your time, measuring and analyzing known business and IT data. You may think this covers everything, but it often doesn’t.

• Blind Area – Your blind spot or your organizational deficiency—something that can be addressed by new data or new inputs. For example, you may be analyzing your IT data while knowingly including data from only certain systems, not all of them. You may choose to do this because you believe the data on your Linux servers to be the only important data. However, a business manager might tell you that the mainframe data is critically important and must be included. And that might be a pain, because it’s on a different platform from your ML concerns, it’s in a different format, and so on. But what’s the point of doing ML at all if you’re ignoring important data? Well, the good news is that this oversight can be corrected by finding ways to include all available data for analysis—and there are technologies available right now that will allow you to do it.

• Hidden Area – Another blind spot, but different in that you (and most people in your company) are unaware of it. For example, if you wanted to analyze all data, including archived data, you may not know that one of your cloud service providers archives some of your data at a third-party site on tape cartridges. This is a tougher nut to crack and could take considerable time and effort, as well as determined thoroughness. But it’s still solvable using existing platforms, tools, techniques and personnel expertise.

• Unknown Area – Another blind spot, but a much worse one, and one that you are unlikely to discover using any of the techniques you have ever used to solve any other business challenge up to this time. For example, an insurance company experiences slightly different claim patterns from varying clients—something that doesn’t even closely fit any of the insurer’s fraud scenarios, may not actually be fraud-related at all, but is still costing the firm money. Data analysis may not reveal anything meaningful due to incomplete data. The unknown area may well be where the greatest insights can be found: those that could yield new levels of business efficiency, revenue and profitability. Further, these unknowns may only be solvable using machine learning techniques, using all types of data.
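One lightweight way to act on those quadrants is to keep an explicit inventory that tags every candidate data source with its platform and its Johari classification, so that blind and hidden sources stop being invisible to the ML effort. The sketch below is a hypothetical illustration of that idea; the source names, platforms and quadrant assignments are invented for the example, not taken from the article.

```python
# Hypothetical data inventory tagged with Johari quadrants. The source
# names, platforms and assignments are invented for illustration.
from dataclasses import dataclass
from enum import Enum

class Quadrant(Enum):
    OPEN = "open"        # known and already being analyzed
    BLIND = "blind"      # exists, but excluded from your analysis
    HIDDEN = "hidden"    # exists, but the organization has lost sight of it
    UNKNOWN = "unknown"  # patterns nobody has identified yet

@dataclass
class DataSource:
    name: str
    platform: str
    quadrant: Quadrant

inventory = [
    DataSource("web_clickstream", "cloud data lake", Quadrant.OPEN),
    DataSource("core_transactions", "mainframe Db2", Quadrant.BLIND),
    DataSource("archived_claims", "third-party tape archive", Quadrant.HIDDEN),
]

# The blind and hidden entries are the ones to pull into the ML pipeline first.
for src in inventory:
    if src.quadrant in (Quadrant.BLIND, Quadrant.HIDDEN):
        print(f"Onboard {src.name} ({src.platform}) before training models")
```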



The bottom line is that you really need to know about all of your relevant data. In some cases, that means going beyond conventional server-based data into the realms of social media, email and text data. You can’t ignore some types of data because they’re inconvenient to obtain, and you can’t just assume that the data at hand is the only data that matters. You need to know your data, and that is best done up front. In Part II, we’ll take a closer look at AI and ML, the challenges businesses face, and how businesses that are not currently engaging in AI and ML will be conceding competitive advantage to their rivals.





How the Internet-of-Things will lead to selling lots of things



By Andrew Armstrong, Chief Customer Officer, omNovos Omni-Channel Customer Engagement

When it comes to consumer-based technology, there’s no slowing down of new “things” being introduced into the marketplace, especially when it comes to AI and digital assistants, and to the emphasis the big players are placing on what can only be described as the digital home invasion. Why invasion? It’s not meant to be alarmist; rather, it’s a reflection of the reality that digital homes are becoming ubiquitous.



A future that’s already arrived

You see, the digital assistant part is actually just the interface—it’s what it controls that’s truly interesting, as it announces a future that has already arrived. From connected thermostats to lightbulbs, smart TVs, appliances, and more, the influx of IoT devices is at an all-time high. Pair that with the likes of streaming television, movie and music services, and the digital assistant becomes the epicenter for all things personalized in the home. But like any new technology, many question the probability of mass adoption. It’s always easy to dismiss new technology as a fad, generational, or otherwise. And for many technologies, most would be right in paying little to no attention. However, in the case of IoT and digital assistants, one needs to look no further than the adoption stats. As of today, one in six adults in the US owns a digital assistant: that’s a staggering 39 million people. Even better, 65 percent of those who own them can’t imagine living without them.

A new standard for retail

So, what does all of this have to do with the world of retail, you may ask? Well, to put it bluntly, everything! The issue that retailers need to account for immediately is that digital assistants, AI, and IoT are becoming the new standard by which people interact with the world. As smart speakers replace antiquated devices, they become the epicenter of control. By simply speaking into the air, users can turn lights on and off, control the temperature, request music to be played, and order food, items from the internet, car services, the list goes on.
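What does it take for a business to show up in that voice-driven ecosystem? At minimum, an endpoint the assistant platform can call when a customer asks for something. The sketch below is a deliberately generic, hypothetical webhook written with Flask; the endpoint path, intent name and slot fields are invented for illustration, since each assistant platform defines its own request and response formats.

```python
# Hypothetical webhook for a voice assistant "place order" intent. The
# endpoint path, intent name and slot fields are invented for illustration;
# real assistant platforms each define their own request/response formats.
from flask import Flask, request, jsonify

app = Flask(__name__)

def create_order(item: str) -> str:
    """Placeholder for a call into the retailer's own order system."""
    return "A-1001"

@app.route("/assistant/webhook", methods=["POST"])
def handle_intent():
    payload = request.get_json(force=True)
    intent = payload.get("intent")
    slots = payload.get("slots", {})

    if intent == "PlaceOrder":
        item = slots.get("item", "your usual order")
        order_id = create_order(item)
        return jsonify({"speech": f"Done. {item} is on its way (order {order_id})."})

    return jsonify({"speech": "Sorry, I can't help with that yet."})

if __name__ == "__main__":
    app.run(port=8080)
```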

And, if all of that becomes the new personal standard for homes and users, where does your establishment sit within that ecosystem? Can people connect to and order from your eCommerce platform? Even that is already antiquated: if your eCommerce platform can’t be reached via AI as of today, then there’s a short-term critical project to address right away. But here’s the more pressing issue: Can people order food or schedule reservations at your restaurant? Can they order items from your retail establishment? Can they dictate a grocery list and ask for dinner recipes from your grocery store? No? Not to be the bearer of bad news, but you have bigger issues coming down the pipe to address. And not just to be competitive but, more so, to survive.

A wakeup call

Now, please understand this isn’t meant as a scare tactic. Consider it more of a wakeup call. For businesses of any kind, the feeling should be the same as that of the buggy whip industry as it witnessed the first fleet of Fords roll off the production line and into the streets. Times change, and so must business. The good news in all of this is that the technology is there and actually easier to deploy than one might think. And if that is the case, the opportunity to be exploited here is one of the greatest in history. Imagine the implications. Gone are the days of old-school advertising and marketing: spray-and-pray techniques that result in on-again / off-again results with little predictability. Now, your business can reside inside every person’s home. You have a direct touchpoint online, in-app, in-store—even in-home and in-car for that matter. A direct, intrinsic connection where your company represents the wants and needs of an individual—personalization as never seen before.

The point in all of this is not to be scared, or to be dismissive of the coming new world order. Embrace it, chase it, make it an integral part of your business process. Take advantage of this opportunity to build your company into something far more spectacular than ever imagined. Or, you could lean towards the other option, which sounds like, “Siri, whatever happened to that store I used to shop at?” The choice is yours.





DevOps in the Multi-Platform Data Center

By Keith Allingham, Marketing Manager, DataKinetics Data Performance & Optimization

Positioned by some as the panacea of new-age IT management, promising shorter development times, improved business processes, reduced time to market, more engaged teams, greater business agility, and more, DevOps is all the rage. If you’re not doing it, you’re doing everything wrong. Your organization is a dinosaur. Your competition is doing it, and they’re going to take away your market share, and so on. Conversely, for two years now, people have been writing about (and demonstrating) what can short-circuit your DevOps efforts, how it can result in no real improvements at all, and how it can even bring your IT organization screeching to a standstill. And it’s still happening a lot. In fact, when IBM first experimented with DevOps, before there was actually a name for it, they experienced a complete failure—the main reason among many was that people didn’t properly understand their own workflows, which pretty much guarantees failure. As in, trying to fix something that you don’t understand …



The DevOps transformation

Worst case scenarios aside, DevOps can, in fact, transform your IT organization, and even your business, for the better—but only where there is serious forethought brought to bear on it. That means ensuring the organization is making the effort to understand current workflows, implementing standard processes, insisting on cooperation between all groups, fostering a cultural change within IT and, more often than not, within the entire organization. But you've read this all before. What has been missing is the all-inclusiveness required in data centers leveraging all sorts of different platforms: I'm talking about data centers that run mainframe systems along with their UNIX, Linux, WinTel and cloud-based servers. In many cases, mainframe systems are responsible for processing 75 percent or more of a company’s revenue streams. And the mainframe group is typically not included in DevOps activity. So think about this: How can a business effectively implement DevOps in its IT organization, and not include the IT department(s) that actually generate the revenue for the business? It makes no sense at all, if you think about it.



Bi-modal in the way

So, why on earth do companies bar the mainframe from DevOps activity? Well, it’s due to another buzzword that we’ve all heard: bi-modal IT. Bi-modal IT is a concept that Gartner came up with some time ago which, in actuality, really just described what was going on in business anyway—IT being divided into two distinct areas. The division is between the areas of responsibility for the IT people supporting the mainframe and the rest of the IT organization. Really, what we’re describing is the encasing of the mainframe and its supporting IT personnel into a silo: cost freeze inside, new funding allowed only outside. The thinking behind bi-modal IT was to control the high cost of computing on the mainframe. Unfortunately, this was based on faulty, or perhaps misleading, information. The truth is that in transaction-intense environments, the mainframe is actually the most cost-effective platform on the planet. But we digress.

Data center transparency

If you want to make DevOps work for your organization, you absolutely must have a complete understanding of all things IT before you put DevOps in place. A big part of that is knowing what IT systems are costing the business—what costs how much, and who’s using what. There are excellent visualization tools available now, but the majority of the offerings in this space are just reporting tools, as opposed to tools that offer actionable IT intelligence. They are capable visual tools that can nicely transform IT data into graphs and bar charts, which definitely adds value over and above the default scenario of forcing CIOs, DBAs and managers to delve through reams of text-based log data. But beyond that, they generally don’t add any real intelligence to the picture.

IT business intelligence

On the other hand, there are some tools that do offer IT business intelligence—but precious few that will give you the big picture across all IT systems. Almost all of these tools are applicable to only one side of the data center—surprise—either the distributed side or the mainframe side. Hint: you need IT intelligence on both sides!

What is needed is a data center-wide solution that provides IT business intelligence on all platforms: UNIX, WinTel and the mainframe. And, fortunately, there are a small number of solutions that provide this transparency into the data center. The key is to combine the reams of IT data that are already being collected with a minimal amount of cost information on server attributes (memory, disk space, CPU core capacity, and so on) and similar cost information on mainframe CPU capacity, MSU usage, and the like. Added to this would be an even smaller amount of business information: department names, asset ownership, system names, LPAR names, application names, and so on. Adding cost and business information to IT information transforms the data into IT business intelligence, from which you can easily see who uses specific resources and what that usage costs the company. It can even shine light on the impact of new applications, changes to business processes, or even business mergers. The reality is that this transparency can transform the IT organization from a huge cost center into a window into business efficiency.
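As a rough illustration of that combination, the sketch below joins a few rows of IT usage data with unit-cost and ownership tables to produce a per-department cost view. The systems, metrics and prices are invented for the example, and any of the column names could differ in a real environment.

```python
# Minimal sketch: joining IT usage data with unit costs and ownership to get
# a per-department cost view. All systems, metrics and prices are invented.
import pandas as pd

usage = pd.DataFrame({
    "system": ["LPAR01", "LPAR01", "linux-web-07"],
    "metric": ["MSU", "MSU", "cpu_core_hours"],
    "amount": [420, 385, 5600],
    "month":  ["2018-10", "2018-11", "2018-11"],
})
unit_costs = pd.DataFrame({
    "metric": ["MSU", "cpu_core_hours"],
    "cost_per_unit": [35.0, 0.04],
})
ownership = pd.DataFrame({
    "system": ["LPAR01", "linux-web-07"],
    "department": ["Claims Processing", "Digital Marketing"],
    "application": ["policy-admin", "campaign-site"],
})

# Usage + cost + business context = IT business intelligence.
bi = usage.merge(unit_costs, on="metric").merge(ownership, on="system")
bi["cost"] = bi["amount"] * bi["cost_per_unit"]

# Who uses specific resources, and what that usage costs the company:
print(bi.groupby(["department", "application"])["cost"].sum())
```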

Moving forward

These solutions are truly DevOps-enabling technologies, helping CIOs and managers to better understand their own workflows, costs and IT resource usage. In fact, without this level of data center-wide intelligence, and without the transparency it provides, effective DevOps is handicapped, and relies on company personnel to provide the intelligence. Unfortunately, there are few people, if any, in most organizations who have the access to and understanding of all this information. Without a capable IT business intelligence system in place, the information needed for effective DevOps just won’t be available. If you’re dedicated to effective, systems-wide DevOps, you owe it to yourself to take a look at IT business intelligence.




