SD Times - July 2018


JULY 2018 • VOL. 2, ISSUE 13 • $9.95 • www.sdtimes.com



When they had to call Mars, they called Rogue Wave first
When your unmanned vehicle is 50 million miles away, nothing can fail. Which is why researchers called us. Our advanced software solutions help operate the vehicle remotely and power all mission-critical communications. Surprised? There’s more to Rogue Wave than you think.

WE’VE GOT MARS COVERED FROM A TO ZEND
ZEND < WEB AND MOBILE APP DEVELOPMENT >

SECURE COMPONENTS < PLATFORM INDEPENDENT BUILDING BLOCKS >

OPENLOGIC < END-TO-END ENTERPRISE OPEN SOURCE >

KLOCWORK < APPSEC AND COMPLIANCE STATIC CODE ANALYSIS >

JREBEL < JAVA DEVELOPMENT PRODUCTIVITY >

AKANA < API MANAGEMENT >

roguewave.com/more



Contents
VOLUME 2, ISSUE 13 • JULY 2018

NEWS
6   News Watch
14  The industry reacts to Microsoft’s acquisition of GitHub
16  Docker wants to reach a wider group of developers
18  JFrog prepares for next-generation DevOps
18  IBM updates UrbanCode Deploy, Velocity for ‘Day 2’

FEATURES
8   AI in Automated Testing: Try it, but tread carefully
20  20th Anniversary of Open Source: The ubiquity of shared code
30  What Edge Computing Means for I&O
33  Containing App Vulnerabilities
35  Protecting the Service Mesh with Istio
37  Application security needs to shift left

COLUMNS
44  GUEST VIEW by Pete Johnson: Iterations are the currency of software innovation
45  ANALYST VIEW by Michael Azoff: The next evolution in application life cycle management
46  INDUSTRY WATCH by David Rubinstein: Open source has won the day

Software Development Times (ISSN 1528-1965) is published 12 times per year by D2 Emerge LLC, 80 Skyline Drive, Suite 303, Plainview, NY 11803. Periodicals postage paid at Plainview, NY, and additional offices. SD Times is a registered trademark of D2 Emerge LLC. All contents © 2018 D2 Emerge LLC. All rights reserved. The price of a one-year subscription is US$179 for subscribers in the U.S., $189 in Canada, $229 elsewhere. POSTMASTER: Send address changes to SD Times, 80 Skyline Drive, Suite 303, Plainview, NY 11803. SD Times subscriber services may be reached at subscriptions@d2emerge.com.



EDITORIAL
EDITOR-IN-CHIEF: David Rubinstein  drubinstein@d2emerge.com
NEWS EDITOR: Christina Cardoza  ccardoza@d2emerge.com
SOCIAL MEDIA AND ONLINE EDITOR: Jenna Sargent  jsargent@d2emerge.com
INTERN: Ian Schafer  ischafer@d2emerge.com
ART DIRECTOR: Mara Leonardi  mleonardi@d2emerge.com
CONTRIBUTING WRITERS: Alyson Behr, Jacqueline Emigh, Lisa Morgan, Jeffrey Schwartz
CONTRIBUTING ANALYSTS: Cambashi, Enderle Group, Gartner, IDC, Ovum

CUSTOMER SERVICE
SUBSCRIPTIONS: subscriptions@d2emerge.com
ADVERTISING TRAFFIC: Mara Leonardi  adtraffic@d2emerge.com
LIST SERVICES: Shauna Koehler  skoehler@d2emerge.com
REPRINTS: reprints@d2emerge.com
ACCOUNTING: accounting@d2emerge.com

ADVERTISING SALES
PUBLISHER: David Lyman  978-465-2351  dlyman@d2emerge.com

PRESIDENT & CEO: David Lyman
CHIEF OPERATING OFFICER: David Rubinstein

D2 EMERGE LLC
80 Skyline Drive, Suite 303
Plainview, NY 11803
www.d2emerge.com

Advertisement: dtSearch®
Instantly Search Terabytes
dtSearch’s document filters support popular file types, emails with multilevel attachments, a wide variety of databases, and web data.
Search options include efficient multithreaded search, easy multicolor hit highlighting, and forensics options like credit card search.
Developers: APIs for .NET, C++ and Java; ask about the new cross-platform .NET Standard SDK with Xamarin and .NET Core. SDKs for Windows, UWP, Linux, Mac, iOS (in beta) and Android (in beta). FAQs on faceted search, granular data classification, Azure and more.
Visit dtSearch.com for hundreds of reviews and case studies, and fully functional enterprise and developer evaluations.
The Smart Choice for Text Retrieval® since 1991
dtSearch.com  1-800-IT-FINDS





NEWS WATCH

Wolfram releases neural net repository
The software company Wolfram Research launched a public repository for trained and untrained neural network models. The Wolfram Neural Net Repository builds on the company’s Wolfram Language neural framework to store neural net models and enable their immediate use for evaluation, training, visualization and transfer learning. The Wolfram Language neural network framework includes models for automated machine learning, representation, operations, basic layers, recurrent layers, sequence-handling layers, training optimization layers, and managing data and training.
“Neural nets have generated a lot of interest recently, and rightly so: they form the basis for state-of-the-art solutions to a dizzying array of problems, from speech recognition to machine translation, from autonomous driving to playing Go. Fortunately, the Wolfram Language now has a state-of-the-art neural net framework (and a growing tutorial collection). This has made possible a whole new set of Wolfram Language functions, such as FindTextualAnswer, ImageIdentify, ImageRestyle and FacialFeatures. And deep learning will no doubt play an important role in our continuing mission to make human knowledge computable,” the Wolfram team wrote in a post.

Microsoft gives a look at Visual Studio 2019
Microsoft revealed what’s next for its suite of development tools with the announcement of Visual Studio 2019. VS 2019 will continue with the company’s vision of making VS faster, more reliable, more productive, easier to use, and easier to get started with. The company is just beginning the early planning phases of the release, but says developers can expect improved refactoring, navigation and debugging capabilities, faster solution load, and faster builds.
“But also expect us to continue to explore how connected capabilities like Live Share can enable developers to collaborate in real time from across the world and how we can make cloud scenarios like working with online source repositories more seamless. Expect us to push the boundaries of individual and team productivity with capabilities like IntelliCode, where Visual Studio can use Azure to train and deliver AI-powered assistance into the IDE,” John Montgomery, director of program management for Visual Studio, wrote in a post.

Perfecto now supports Progressive Web Apps
Perfecto is bringing its automated continuous testing solution to Progressive Web Apps (PWAs) with its newly announced support. PWAs are apps that run in the web browser, and do not have to be accessed through the app store. According to Perfecto, PWAs improve user experience, grow engagement and increase conversions.
The company expects PWA adoption to increase over the next year based on a recent survey it conducted. Forty-one percent of respondents reported they plan to add PWA to their solutions that already follow responsive web design, and an additional 32 percent are researching the shift.
“PWA is one of the latest web development market innovations being embraced today after responsive web design to enhance user experience,” said Roi Carmel, chief strategy officer at Perfecto. “Our Automation Coverage Extension capabilities provide us the architectural advantage to provide DevOps teams with what they need to automate PWA testing to deliver flawless user experiences.”

Facebook releases Sonar to open source
Facebook has made its extensible debugging tool, Sonar, available as open source. Sonar was originally created to help Facebook engineers manage the complexity of working with multiple different modules. According to the company, Sonar provides a framework where experts and developers can convey important information to users. It also provides engineers with an intuitive way of inspecting and understanding the structure and behavior of iOS and Android applications.
When it was started three years ago, Sonar was built upon Stetho, an Android debugging bridge built on Chrome’s developer tools. Sonar added new features, provided a richer user experience, and works across both iOS and Android. Facebook recommends the use of Sonar over Stetho for most use cases going forward.

Google App Maker low-code platform now available
G Suite users can now build custom apps to meet their business needs with the availability of App Maker, G Suite’s low-code environment. The solution was first announced in November of 2016.
“Analysts estimate that the right custom mobile app can save each employee 7.5 hours per week (that’s a week’s worth of lunch breaks!). Yet, too few businesses have the means, let alone the resources, to invest time and effort in building custom apps. Why? Because their IT budget centers on big enterprise apps like CRM, ERP and SCM and beyond those priorities, IT executives’ attention focuses on security and governance,” Geva Rechav, product manager of App Maker, wrote in a post.
App Maker tackles that



problem by providing a way for line-of-business teams to build apps for things like requesting purchase orders or filing and resolving help desk tickets.

GitLab premium services free for open source, education
Following a major boost in activity spurred on by Microsoft’s acquisition of GitHub and Apple’s announcement of Xcode integration for GitLab, GitLab has announced that its two most robust project management offerings, GitLab Ultimate and GitLab Gold, are now free for educational institutions and open source projects.
GitLab Ultimate is self-hosted, while GitLab Gold is the company’s primary SaaS offering hosted on GitLab.com. Both include all of the features of the GitLab Core, Starter and Premium services, plus more advanced management and security features, including Epics, Roadmap, Static Application Security Testing, Container Scanning and others. Missing from the free offerings is support, but it can be purchased at a steep discount for a monthly fee, the company explained.

Android P beta 2 includes APIs, ML
Google has announced the second beta release of its upcoming operating system, Android P. This update includes the final Android P APIs, the latest system images, and updated developer tools to prepare for the consumer release. This beta release utilizes machine learning in many of its new features, according to the company.
In collaboration with DeepMind, Google is adding an Adaptive Battery feature that uses machine learning to prioritize system resources for apps that users care about most. It also adds App Actions, which is a way to help raise app visibility and drive engagement. It uses machine learning to surface apps to users at the right time, based on an app’s semantic intents and the user’s context.

Report: The top three programming languages of 2018 are Java, JavaScript and Python
Java remains the most popular primary programming language, but JavaScript is the most used programming language overall. That is according to a recently released report from JetBrains on the State of the Developer Ecosystem in 2018. The report surveyed more than 6,000 developers from 17 countries to reveal the trends driving the world of coding this year. Topics covered included programming languages, development environments, databases, issue tracking, continuous integration, deployment and DevOps.
According to the report, Java, JavaScript and Python are the top three programming languages this year, and Go is the most promising language. Twenty percent of developers use multiple versions of Go at the same time, and 26 percent set up their GOPATH per project. The top Go frameworks include Gin, Beego, Echo and Buffalo.
While 38 percent of developers have no plans to adopt any new languages this year, the top languages respondents have started to learn in the last year include Python, JavaScript, Java, Go, TypeScript and Kotlin.

NativeScript 4.0 eases mobile development
Progress announced the latest version of its open-source framework for cross-platform, native iOS and Android app development at its annual conference, ProgressNEXT, in Boston in May. NativeScript 4.0 is designed to ease mobile development with a new streamlined development workflow, support for advanced navigation scenarios and deeper integration with Vue.js. According to the company, 70 percent of the developer

population uses JavaScript as their preferred programming language. NativeScript 4.0 aims to help developers utilize their existing skill sets with new core framework updates, tooling and plugins.
The latest release features support for Angular 6, the latest version of the framework. NativeScript 4.0 also enables users to build web and mobile apps with the Angular CLI and Angular Schematics, the framework’s workflow tools for development extensibility and reusability. Other features include the ability to enable LiveSync with Webpack simultaneously for a better development experience, and asset generation and updated templates to eliminate things like image editing.

Angular for Designers initiative progressing
Angular is continuing its work to bring developers and designers together. The team announced the Angular for Designers initiative at ng-conf in May. While the public launch is still months away, the

team has partnered with Google’s UX engineers to make its vision possible. “Designers put a lot of time into designing components and features for their products, however there often still remains a gap between a designer’s vision and a developer’s reality,” Blair Metcalf, UX engineer at Google, wrote in a post. “Designers are ready for a dynamic, data-driven way of working and the web is capable of delivering the tools they want.” As part of the initiative, the UX engineering team is working on a WYSIWYG prototyping tool that will eventually give designers the ability to create prototypes without writing code. “Many articles have been written about the unresponsive nature of current design tools that lack real data; designers are not able to work in a medium that feels like their final product. Instead, designers are limited to creating static mockups that don’t convey experiences well and are often on their own if they want to build a more interactive prototype,” Metcalf wrote. z




AI in Automated Testing: Try it, but tread carefully
So far, hype around AI in testing is way ahead of the adoption and execution, creating a lot of confusion in the industry. On top of that, analysts say the landscape will look markedly different in five years.
BY LISA MORGAN

AI, machine learning, and related technologies are more popular than ever, but when it comes to automated testing, the hype outpaces the reality. While there are a few automated testing solutions that take advantage of AI, and perhaps machine learning or deep learning, the level of chatter might lead one to believe that such tools and services are more pervasive than they actually are yet.



“There’s a lot of confusion in the market. Everything and everybody is [claiming to have] AI or machine learning, but when you go to a trade show and ask, ‘What do you really do?’ they say, ‘Record and playback,’ or ‘static code analysis.’ Neither of those are really AI,” said Theresa Lanowitz, founder and head analyst of market research firm Voke. “We’re a ways from people implementing AI technology in the enterprise to help them with their development, test and operations, but there are tools going in that direction.”
For example, there’s AppliTools Eyes for UI testing; AutonomIQ and Functionize autonomous testing solutions; Mabl, a machine-learning test automation service for web apps and websites; and Parasoft SOAtest for API testing.
It’s a good idea to experiment with the tools to understand the scope of their capabilities. It’s also wise to learn a bit about AI, machine learning and deep learning to understand what they do. “I think it would be a mistake to use it without understanding its limitations,” said Bob Binder, senior software engineer at the Carnegie Mellon Software Engineering Institute.

How AI can help automated testing
Software teams are under constant pressure to deliver better quality products in ever-shorter timeframes. To do that, testing has shifted both left and right, and the automation of tests has become critical. However, in the meantime, traditional test automation has become a bottleneck.
“Over the past several years, we’ve told testers ‘you have to become more technical, you have to learn how to code,’” said Voke’s Lanowitz. “They’ve

now become Selenium coders, and that’s really not the best use of their time. If you’re an enterprise, you want to take those expensive resources and have them developing products, not just test cases.”
Rather than having engineers write scripts, there are now solutions that can automatically generate them. “When we talk about test automation today, for the most part, we are really talking about the automated ‘execution’ of tests. We’re not talking about the automated ‘creation’ of tests,” said Joachim Herschmann, research director at Gartner.
One approach is to provide an autonomous testing solution with a test case written in a natural language and it will autonomously create the test scripts, test cases, and test data. The autonomous nature of the system frees up testers to do other things such as explore new technologies, advocate for the customer or line of business, as well as be more influential and strategic, said Voke’s Lanowitz.
Web and mobile app developers have options that help ensure the user experience is as expected. “The AppliTools Eyes product uses the same style of algorithms that Google would use for facial recognition or people are using for other types of optical recognition,” said Thomas Murphy, research director at Gartner. “Using Eyes, I can tell you whether the application is working right and whether it looks the way you expect it to.”
Software development firm Gunner Technology has used AI-aided automated testing a couple of times. “So much of it is repetitive because you’re doing the same thing over and over again,” said Dary Merkens, CTO of Gunner Technology, a custom software development shop. “AI can look at your application, parse out the elements on a page and apply pre-existing heuristics to generate thousands of tests itself, which would be a tremendous human effort.” At the present time, Gunner Technology is developing a mobile app that will launch at the end of the summer for which it is also using AI-based automated testing.
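To picture what those hand-coded Selenium scripts look like, here is a minimal, illustrative Python sketch of the kind of UI test that AI-assisted tools promise to generate from a plain-language test case instead. The URL, element IDs and expected page title are hypothetical placeholders, not taken from any product mentioned in this article.

# Illustrative only: a hand-written Selenium check of the kind AI-assisted
# tools aim to generate automatically. URL and element IDs are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a local ChromeDriver is installed
try:
    driver.get("https://example.test/login")
    driver.find_element(By.ID, "username").send_keys("demo_user")
    driver.find_element(By.ID, "password").send_keys("demo_pass")
    driver.find_element(By.ID, "submit").click()
    # The tester still has to encode the expected outcome by hand.
    assert "Dashboard" in driver.title, "login did not reach the dashboard"
finally:
    driver.quit()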


“There are these little edge cases where you don’t know what is going to happen on a pixel-by-pixel basis,” said Merkens. The benefit of machine learning is pattern identification. In the case of automated testing, that means, given the right training, it is able to distinguish between a failed test and a passed test, although there are other interesting possibilities.
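The pattern-identification idea can be sketched with a small supervised-learning example: given labeled telemetry from past runs, a classifier learns to separate passing runs from failing ones. The features, numbers and labels below are invented for illustration and do not describe any vendor’s model.

# Illustrative sketch: learning to tell failed test runs from passed ones
# using past telemetry. All features and data here are made up.
from sklearn.ensemble import RandomForestClassifier

# Each row: [duration_seconds, console_errors, visual_diff_score]
past_runs = [
    [12.1, 0, 0.01], [11.8, 0, 0.02], [35.4, 3, 0.40],
    [12.5, 0, 0.00], [40.2, 5, 0.55], [13.0, 1, 0.05],
]
labels = ["pass", "pass", "fail", "pass", "fail", "pass"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(past_runs, labels)

# Score a fresh run; retraining on new data over time is what makes this
# a continuous process rather than a one-time setup.
print(model.predict([[33.0, 2, 0.38]]))  # likely ['fail']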

‘When we talk about test automation today, for the most part, we are really talking about the automated ‘execution’ of tests.’ —Joachim Herschmann, Gartner

“It could be used to understand which tests should be run based on a change that was made or risk of going into production with a particular release,” said Gartner’s Murphy. “This is where the more that people use cloud-based tools that allow them to run analytics across anonymized data, you can start looking for patterns and trends to help people to understand what to focus on, what to do. We’re in the early phase of this.”
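As a rough sketch of the change-based test selection Murphy describes, a map from source files to the tests that exercise them lets a pipeline run only the tests a change could affect. The file names, test names and map below are hypothetical; real systems derive such maps from coverage data or build dependency graphs.

# Illustrative sketch: choose which tests to run based on what changed.
# The coverage map and names are hypothetical.
coverage_map = {
    "billing/invoice.py": {"test_invoice_totals", "test_invoice_rounding"},
    "billing/tax.py": {"test_invoice_totals", "test_tax_rates"},
    "ui/header.py": {"test_header_renders"},
}

def tests_for_change(changed_files):
    """Union of tests covering the changed files; unknown files trigger a full run."""
    selected = set()
    for path in changed_files:
        if path not in coverage_map:
            return set().union(*coverage_map.values())  # unknown file: run everything
        selected |= coverage_map[path]
    return selected

print(sorted(tests_for_change(["billing/tax.py"])))  # ['test_invoice_totals', 'test_tax_rates']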

Getting started
Some vendors are promoting the merits of AI and machine learning, whether their product actually uses the technologies or not. As with other types of products, the noise of similar-sounding offerings can make it difficult to discern
continued on page 10 >




5 Intelligent Test Automation Tools
AI and machine-assisted automated testing tools are relatively new. The only way to understand exactly what they do and how their capabilities can benefit your organization is to try them. Following are five of the early contenders:
• AppliTools Eyes is an automated visual AI testing platform targeted at test automation engineers, DevOps and front-end developers who want to ensure their mobile, web and native apps look right, feel right and deliver the intended user experience.
• AutonomIQ is an autonomous platform that automates the entire testing life cycle from test case creation to impact analysis. It accelerates the generation of test cases, data and scripts. It also self-corrects test assets automatically to avoid false positives and script issues.
• Functionize is an autonomous cloud testing platform that accelerates test creation and executes thousands of tests in minutes. It also enables autonomous test maintenance.
• Mabl is machine learning-driven test automation for web apps that simplifies the creation of automated tests. It also identifies regressions and automatically maintains tests.
• Parasoft SOAtest API testing is not a new product. However, the latest release introduces AI to convert manual UI tests into automated, scriptless API tests.
—Lisa Morgan

< continued from page 9
between what’s real and what isn’t, particularly for the uninitiated. “Really do due diligence and a POC to make sure it’s going to reduce or eliminate that human interaction you have to have,” said Voke’s Lanowitz. “Realize this is new technology and new innovation, because we haven’t had a lot of innovation in the testing space. Tools have gotten less expensive, they produce fewer false positives, but we haven’t had a lot of innovation. This is innovation.”
While it’s fun to be technologically curious, it’s also wise to consider how the organization could benefit from such a product or service and whether the organization is actually ready for it. Machine learning requires data which may not be readily available. Alternatively, if the data is available, it may not have been curated because no one knows how to make sense out of it.
“Increasingly, people talk about shift right and the idea is essentially I have all this data about how end users are using my application, where errors are occurring and the load in the system. I can use AI to make it much more meaningful,” said Gartner’s Herschmann. “The whole notion of testing and QA broadened in scope from the ideation phase to the requirements phase all the way back to when things are live in production. I can use the data in a machine learning context to identify patterns, and then based on the patterns, I can make certain changes. Then rinse and repeat all the time.”
It’s a mistake to underestimate the dynamic nature of machine learning, because it’s a continuous process as opposed to an event. Common goals are to teach the system something new and improve the accuracy of outcomes, both of which are based on data. For example, to understand what a test failure looks like, the system has to understand what a test pass looks like. Every time a test is run, new data is generated. Every time new code is generated, new data is generated. The reason some vendors are able to provide users with fast results is because the system is not just using the user’s data, it’s comparing what the user provided with massive amounts of relevant, aggregated data.
“Three or four years ago, Google said that their code base then was like 100 million lines and it’s well past that now. Every day, that code base is growing linearly and so is their test code base, so that means that test execution is growing exponentially and at some point it’s no longer affordable,” said Gartner’s Murphy. “They built tools to determine which tests need to be fixed or thrown out, which tests are of no value anymore, what tests should be run based on what changes have been checked into [a] build. These things are what organizations have to look at and now you’re seeing companies other than Google do this.”

What to expect along the way
While autonomous testing and AI technologies aren’t new, the combination of them is in the early stages. More and different types of products will hit the market in the coming months and years. Meanwhile, there will be a lot of trial and error involved by end users and vendors.
“If you look at the Gartner Hype Cycle, all of the technologies that are in some shape or form related to machine learning are all just climbing the slope. Basically that means they are still ahead of getting into the trough of disillusionment,” said Gartner’s Herschmann. “I think we will see people fail at using these kinds of technologies [because] there’s a lot of over-promise. We tell people you’ve got to have the right expectations about what you can do with this because yes, we’ve seen some very cool things like Google and Facebook and some of the other big guys, but keep in mind this is very, very narrowly focused. We’re decades away from anything that’s general purpose AI.”
Voke recommends taking a long-term view of the technology and considering how it’s going to impact the mix of skills in the organization and its workflows. “Understand where skills can go and how you can use the skills to benefit the
continued on page 12 >



Learn, Explore, Use
Your Destination for Data Cleansing & Enrichment APIs
Global Address • ID Verification • Global Email • Global IP Locator • Global Phone • Property Data • Global Name • Business Coder

DEVELOPER: Your centralized portal to discover our tools, code snippets and examples.
RAPID APPLICATION DEVELOPMENT: Convenient access to Melissa APIs to solve problems with ease and scalability.
REAL-TIME & BATCH PROCESSING: Ideal for web forms and call center applications, plus batch processing for database cleanup.
TRY OR BUY: Easy payment options to free funds for core business operations.
FLEXIBLE CLOUD APIS: Supports REST, JSON, XML and SOAP for easy integration into your application.

Turn Data into Success – Start Developing Today! Melissa.com/developer 1-800-MELISSA



< continued from page 10

overall software life cycle,” said Lanowitz. “You can’t absolve all your responsibility and say ‘we have this new tool, so we don’t have to have a test engineer sitting there monitoring it.’ The role changes. Also, you can’t just plug these things in and let them go. You can’t assume they’re perfect out of the box. That’s where the idea of training comes in.”
While it’s common to focus on the technology aspect of this or any other technology for that matter, what’s often underestimated are the impacts on people and processes. “You can’t have one group trying to control the software life cycle through a turf war. This is going to change the way software is developed, tested, deployed, so I think the organization has to be ready to embrace this innovation and not stay stuck,” said Lanowitz. “We know from our research that there’s not a lot of automated testing [or] software release management going on. People say they’re releasing more frequently, but what they’re doing is still very manual so there’s a lot of room for error. People have to be ready for this mode of more autonomous, more artificially intelligent, guided solutions within their organization to be ready to embrace it.”
Gartner’s Murphy said organizations should plan for a four-month window to understand how the new tools differ from traditional tools, how to apply the new tools well and get the staff trained. “Expect to get some positive benefits over a year’s worth of time, but expect it to take some time to get it going and have things move forward,” said Murphy.
Don’t get too comfortable, though, because many of the companies behind AI and machine-learning-aided automated testing tools and services are startups. They may look hot or be hot now, but some of them may fail while others may be acquired by industry titans. “Most of these new AI-ish guys are $5 million and under. They’re not enterprise-scale types of organizations,” said Murphy. “Open source requires some assembly, sometimes major assembly, so it’s a market that’s going to be changing

‘These are big transitions culturally and technically so you should move forward in an agile, incremental fashion: pick a team, a project, school them up and see how it works.’ —Thomas Murphy, Gartner

quite a bit over the next few years.” Meanwhile, Lanowitz thinks some automated testers may evolve into automated testing system trainers. “If you’re currently an automated test engineer, this is an opportunity to increase your skill set because what you’re going to be doing is not just using a tool, you’re going to be training that software that fits within the scope of what you want to do in your own organization,” said Lanowitz.

Potential pitfalls to avoid The Silver Bullet mentality is alive and well in the AI space, and that will trickle down to the automated testing space. Despite the fact there are no silver bullets, massive amounts of industry hype continue to set unrealistic expectations. Organizations sometimes jump on the latest bandwagon without necessarily understanding what they’re adopting, why, and what their current state really is. They want answers that will help them navigate the new territory faster and more intelligently, but they don’t always know which questions to ask. “If you’re only looking at what you’re spending versus what others are spending, how is that going to affect if you’re getting better or not?” said Gartner’s

Murphy. “I try to get clients to understand what would make them better, such as what their weaknesses are, but they often don’t have a very good handle on that. These are big transitions culturally and technically so you should move forward in an agile, incremental fashion: pick a team, a project, school them up and see how it works.”
Don’t plan a wholesale shift that involves the entire organization, in other words. Start small, experiment, make mistakes, learn from the mistakes, build upon successes, and keep learning. One Gartner client spent a year experimenting with what was available and doing a pilot. The results were not as expected, but instead of considering the endeavor a failure, the organization realized the tool had a lot of potential that would probably take another year or two to realize.
“I think [the ability to pivot] is more important here than in many other technologies because you can’t just drop this thing in tomorrow and then be good for the next five years,” said Gartner’s Herschmann. “This is an investment in the sense that you need to adapt to how it’s changing.”
Apparently, the lead analysts at Gartner think that the AI landscape may look very different five years from now. There’s a lot of innovation and a lot of venture capital and private equity flowing into the space. “Don’t expect that this is going to be the only way forward for the next 10 years,” said Gartner’s Herschmann.





The industry reacts to Microsoft’s acquisition of GitHub
BY CHRISTINA CARDOZA

When Microsoft last month announced it had acquired the open-source code repository GitHub for $7.5 billion, Microsoft’s CEO Satya Nadella said that with developers on the forefront of this new digital era, the company wanted to ensure it could help them learn, share and work together to build software solutions. “Developers are the builders of this new era, writing the world’s code. And GitHub is their home,” he wrote in a post. Nadella cited three opportunities for the two companies going forward: 1. To empower developers, 2. To accelerate the enterprise use of GitHub and 3. To bring Microsoft developer tools and services to new audiences. Since the news was announced, a number of thought leaders have put their two cents in about the acquisition. Overall, it seems the majority of the industry agrees that this is a natural fit for the two companies. Here is what some thought leaders had to say:

• Jim Zemlin, executive director of the Linux Foundation:

Should the open source community be concerned? Probably not. Buying GitHub does not mean Microsoft has engaged in some sinister plot to ‘own’ the more than 70 million open source projects on GitHub. Most of the important projects on GitHub are licensed under an open source license, which addresses intellectual property ownership. The trademark and other IP assets are often owned by a non-profit like The

Linux Foundation. And let’s be quite clear — the hearts and minds of developers are not something one ‘buys’ — they are something one ‘earns.’

Chris Wanstrath, left, GitHub CEO and co-founder, joins Microsoft’s Nat Friedman, right, who will become GitHub CEO, and Microsoft CEO Satya Nadella. (Photo courtesy of Microsoft)

• Sid Sijbrandij, CEO of GitLab:

The way developers produce, deliver and maintain code has changed significantly in the last ten years and we applaud GitHub for being a driving force supporting the vast independent developer community through this evolution. This acquisition affirms the global importance of software developers and their influence in the enterprise. Microsoft likely acquired GitHub so it could more closely integrate it with Microsoft Visual Studio Team Services (VSTS) and ultimately help drive compute usage for Azure.

• Mik Kersten, CEO of Tasktop:

Microsoft has long known how to value the hearts and minds of developers. Though it's under Nadella that they seem to be getting the value of OSS. Not a bad price tag given this gets them

a good chunk of the world’s software!

• Sacha Labourey, CEO of CloudBees:

The net of the news is that this acquisition confirms the strategic value of DevOps. As software is eating the world, developers have become the new kings. DevOps vendors that have been able to show great developer adoption as well as build associated revenues are incredibly strategic. The immediate question is whether Microsoft is a good destination for GitHub? The answer is easy: I can’t think of a better destination for GitHub than “The New Microsoft.” The New Microsoft totally gets developers. GitHub has built an amazing social network for developers who are likely not going to be in a hurry to leave this buzzing hive anytime soon for some temporary FUD. Furthermore, I predict Microsoft will be further investing to pursue GitHub’s mission, including adding what has been lacking the most continued on page 16 >



< continued from page 14

to GitHub in the last year: a leader at their top.

• Brian Fox, CTO of Sonatype:

Microsoft’s acquisition of GitHub shows that the developer is king, collaboration is critical to innovation, and open source has truly taken center stage. While Microsoft initially viewed open source programs with skepticism, fearing competition with its proprietary model, it has quickly realized that open source, and its developers, are what’s driving today’s application economy. Developers build the infrastructure that underpins our lives and businesses; collaboration between them is crucial to unlock open source’s full potential. Any and every initiative that helps to build better software faster should always be supported. With Microsoft’s resources behind a great company like GitHub, the future of secure, quality open source looks brighter than ever.

• Setu Kulkarni, VP of product & corporate strategy for WhiteHat Security:

Microsoft’s purchase of GitHub is a visionary acquisition that can further extend their cloud-native design-develop-run ecosystem play. With GitHub and subsequent integrations with the breadth of the Microsoft stack, Microsoft hopes to gain wider adoption across the developer community and enterprises alike.

• Udi Nachmany, VP of sales and business development for Cloud 66:

Microsoft will now own large parts of the software delivery chain: GitHub (source control), Visual Studio (IDE), Azure and Azure Stack (compute), package management (Helm), and more. Given Microsoft's increased activity around Kubernetes, CI and container delivery seem to be gaps in that story—this is a great opportunity for providers of complementary tools to offer unique value to customers.

• Sam Basu, a member of the developer relations team for Progress:

This was destined to happen. Microsoft is by far the biggest contributor to GitHub and cares too much about its OSS investments after the death of CodePlex.

• Randy Bias, VP of technology and strategy at Juniper Networks:

GitHub is a natural extension of what Microsoft has always been about. This deal proves we are deeply into the cloud era and that businesses must embrace and transform themselves or face the consequences.

In addition, Atlassian and GitLab both revealed that since the news went live, they have seen an uptick in teams switching over to their solutions. Atlassian stated in a post that “After the announcement of Microsoft’s acquisition of GitHub, Bitbucket started to see a spike in the number of GitHub users migrating their repositories to Bitbucket. Why? Many users understand they can get everything they had on GitHub in Bitbucket plus more, and at a lower cost. Tens of thousands of customers — including 60 of the Fortune 100 — turn to Bitbucket as their code collaboration solution.”
According to GitLab: “It has been a crazy 24 hours for GitLab. More than 2,000 people tweeted about #movingtogitlab. We imported over 100,000 repositories, and we’ve seen a 7x increase in orders.”

Docker’s goal: Reaching more developers
BY DAVID RUBINSTEIN

Docker unveiled new tools and functionality to enable a wider range of developers to create containerized applications, and for IT to manage applications across clouds and platforms. The announcements were made at the company’s 5th DockerCon conference, held this year in San Francisco.
For developers, Docker announced Docker Desktop GUI, template-based workflows designed to ease developers into containerization. The templates mean developers don’t have to learn to write Dockerfiles or Compose files, and let teams create models for collaboration. “This is a new way to build containerized applications for people with little knowledge of creating containers,” said David Messina, chief marketing officer at Docker.
On the IT side, Docker Enterprise Edition gets the ability to facilitate federated application management. Through the one interface, organizations can manage and secure their applications whether running on-premises, in multiple clouds and across hosted Kubernetes-based cloud services, according to the company’s announcement.
“Containers are portable, but management is not,” explained Messina. The solution, he added, solves the problem of running Amazon Kubernetes Services or Amazon EKS with other solutions and having to log into each one to view the application. “You need to be able to see all the places your application is deployed at once,” he said. “You need an aggregated view and the ability to act on” that information.
The solution also brings an enterprise level of security, policy, enforcement, and governance. “A key part is that Docker has never been tied to an operating system or virtualization model,” added Jenny Fong, Docker’s director of product marketing. She added federated asset management enables users to determine who has touched the application, or where the app came from, as well as offering features such as vulnerability scanning, image signing and security.
Finally, Docker demonstrated Windows Server Containers in Kubernetes with Docker Enterprise Edition. “We’ve been working with Microsoft in engineering since 2014,” Messina said. “We began running Windows and Linux in the same Docker Enterprise cluster since August 2017.” The solution gives organizations the ability to use Kubernetes in .NET and Windows Server-based applications with Docker EE.
“Containerization is now central to an organization’s IT strategy, to their cloud strategy,” Messina said. “Developers need to know how to work containers into their applications.”





DEVOPS WATCH

JFrog prepares for next-generation DevOps
Enterprise+ platform provides binary life cycle management
BY JENNA SARGENT

JFrog has released its universal software binary platform Enterprise+. The platform is designed to provide continuous updates for any language and destination as well as manage the lifecycle of binaries for cloud-native apps.
“Users today have zero tolerance for broken, cumbersome, and slow software updates. We all strive to achieve life without outages,” said Shlomi Ben Haim, CEO and co-founder of JFrog, in a company announcement. “Most of the code in today’s world is already written, thus the real DevOps challenge is to enable a continuous artifacts flow. JFrog is in the business of fast, automated, and secure software releases, our priority is to aid the next generation of DevOps by furthering continuous updates.”
Enterprise+ provides an end-to-end pipeline for managing binary artifacts from development to production deployment. It manages lifecycles for both binaries of cloud-native apps and traditional IT apps, and can also be used for mobile development and IoT devices. According to the company, the solution aims to give users full control over the storage, promotion, security, and distribution of binary releases to remote endpoints. In addition, it includes a unified dashboard designed to give users access to analytics throughout the software development lifecycle. Other features include integrated authentication and authorization management to ensure security across different regions.
Lastly, the company explained Enterprise+ will include other JFrog solutions such as JFrog Artifactory Binary Repository Manager, JFrog Xray, JFrog Distribution, JFrog Artifactory Edge, JFrog Mission Control, JFrog Insight, and JFrog Access Federation.

[Image caption: The JFrog Platform gives users an end-to-end pipeline to control the flow of binaries.]

IBM updates UrbanCode Deploy, Velocity for ‘Day 2’
BY CHRISTINA CARDOZA

IBM is updating its DevOps solutions with the release of UrbanCode Deploy 7.0 and UrbanCode Velocity 1.0.
“A growing number of companies are moving beyond the traditional DevOps approaches of applying lean and agile in isolated projects or teams. Today, the name of the game is what’s being dubbed as DevOps ‘Day 2,’ a world that emphasizes scalability and teamwork. In such environments, development teams shift right, operations teams shift left and together they adeptly deliver and manage multiple, complex projects at the same time,” IBM wrote in a post. “DevOps is ten years young and it is now

the moniker for software-driven innovation done well. As enterprises scale DevOps to address Day 2 challenges, the IBM UrbanCode team seeks to help them be as effective as possible.” UrbanCode Deploy is a deployment and release automation solution. The latest version features a new endpoint management layer designed to help teams operate at scale. “This means you can manage tens of thousands of endpoints with a single UrbanCode Deploy server. This will save our clients money, and enable a single, centralized server to manage all the deployment processes and templates,” IBM wrote. UrbanCode Velocity is designed to

collect data and deliver actionable insights, according to IBM. The latest release features deeper insights into builds, deploys, tests, releases and trends. “UrbanCode Velocity is far more than just reporting. As different teams implement continuous delivery they often start with their own pipeline tools. This is especially true given that cloud providers often supply simple deployment tools as part of the platform. IBM does this in public cloud with IBM Cloud Continuous Delivery and in IBM Cloud Private with Microclimate (which includes Jenkins),” the company wrote. z



Software delivery; it’s a team sport

In software delivery if teams aren’t rowing together, they’re rowing in circles. Integrate the complex network of disparate tools that you use to plan, build and deliver software at scale. Automate the flow of product-critical information across your entire software delivery process. Help your teams pull together.

Connect

Visualize

Measure

tasktop.com



20th Anniversary of Open Source
The ubiquity of shared code
It’s taken a generation, but today, whether you’re using an app, checking out at a store or even driving your car, chances are excellent that the software underneath uses open source.
BY CHRISTINA CARDOZA

“Why is open source important? That’s like asking why is gravity important,” stated Brian Behlendorf, a leading figure in the open-source software movement, and executive director for the blockchain consortium Hyperledger.
While this year marks the 20th anniversary of open source, it is hard to imagine a time before open-source software. Today, it’s difficult to find a solution or piece of software that was created without some open-source components. According to Behlendorf, the largest companies out there including Amazon, Google and Facebook, would not be possible if it wasn’t for open source, or if it was possible, their solutions would

be much more expensive. “Today, open source is the assumed default for any new software development,” said Simon Phipps, president of the Open Source Initiative (OSI). “Having the freedom to meet your own needs with software collaboratively with a community is the best way of dealing with large complex systems in large complex stacks, and consequently everyone is doing that. Everyone would rather work in an open source environment than try to replicate the attributes of the open-source environment in a proprietary space.”
Behlendorf recalls first being attracted to the open-source space because he didn’t really trust his own coding abilities, and the idea that there were other developers out there willing to read his

code and help him fix it was a “godsend.” “For many people who became programmers after the ‘90s, working publicly, pulling down open-source code, sharing improvements back if you made any or even taking your work and releasing it publicly became the default. It was what was expected,” he said. However, being able to share and collaborate openly on software wasn’t always possible. OSI’s vice president VM Brasseur describes the early days as the “wild west” where programmers were building tools for programmers, driven by the philosophical belief that “sharing is good and right.” According to Brian Fox, CTO of Sonatype, there were no campfires in the beginning. There were no places for developers to come together except for



forums, and even with forums it was really hard to share code. It took the release of solutions like SourceForge, and later GitHub, to really make open-source software more accessible, Fox explained. “GitHub has made it super easy for people to fork your stuff and contribute your code back in pull requests. That was one of the major innovations GitHub provided,” he said. “Prior to that, it was a lot of work for a committer to be able to take a patch from someone else and merge it into the codebase.”
The start of “open source” began around 1998, when OSI’s Phipps says the company Netscape came along with plans to release its browser code under a free software license. Instead of going for the GPL, the company created a

new license, which became known as the Mozilla project license. “It became obvious that there was a big slice of this software freedom movement that was unrepresented. Tied up with that was a difficulty in talking about it because the words the movement used to talk about it up to that point were confusing. When you hear the word free, you assume it doesn’t cost anything,” he said.
So, in 1998, a group of people got together and decided to reframe the software freedom movement in a way that would allow people to quickly understand what it was about, and would allow businesses to embrace it without needing to engage in a complicated debate about ethics, Phipps explained. Out of that came the decision


to use the term open source. “The introduction of the term ‘open source software’ was a deliberate effort to make this field of endeavor more understandable to newcomers and to business, which was viewed as necessary to its spread to a broader community of users,” Christine Peterson, who is known for coining the term open source, wrote in a February blog post retelling the story.
According to OSI’s Phipps, the term open source had already been commonly used in the industry at that point, but really took off when Peterson and Todd Anderson began using the term at a meeting at VA Research. Weeks later, the term was picked up by Tim O’Reilly, who renamed his Freeware Summit to Open Source Summit, and also started to
continued on page 22 >



The evolution of Open Source
• Sept 27, 1983: Richard Stallman announces the GNU Project for creation of a free Unix
• Oct 4, 1985: Stallman creates The Free Software Foundation to promote the use of free (as in speech) software
• 1987: The GCC (GNU Compiler Collection) and the Perl programming language, written by Larry Wall, are introduced
• 1991: Guido van Rossum releases the Python programming language
• Aug 25, 1991: Linus Torvalds releases the Linux kernel
• 1995: The PHP and Ruby programming languages are created
• 1996: The first version of the Apache web server, created by Robert McCool, is made public
• May 27, 1997: Eric S. Raymond presents his seminal essay, “The Cathedral and the Bazaar,” at Linux Kongress in Germany. Two years later, it would be published as a book
• Aug 15, 1997: Miguel de Icaza and Federico Mena start work on the GNOME project, a desktop environment for Linux
• Feb 3, 1998: Christine Peterson coins the term open source at a strategy meeting at VA Linux
• Feb 5, 1998: The Open Source Initiative (OSI) is formed, and the Open Source Definition, derived from Bruce Perens’ work at Debian GNU/Linux, is adopted
• Apr 14, 1998: Tim O’Reilly’s Freeware Summit becomes known as the Open Source Summit
• 1998: The Linux World Conference and Expo debuts in San Jose, California
• 2000: The “Low-Level Virtual Machine” (LLVM) compiler toolkit project is started at the University of Illinois
• 2001: IBM creates the Eclipse Project, which would become the Eclipse Foundation
• 2003: Phoenix, an experimental branch of the Mozilla Project, is completed and renamed Firefox, a popular web browser
• 2005: Torvalds creates the Git code repository
• 2008: Google releases the Android operating system for mobile phones and tablets
• Sept 10, 2009: Microsoft, once seen as the mortal enemy of open source, forms the CodePlex Foundation to further open source. In 2016, Microsoft would join the Linux Foundation

< continued from page 21
be used by Netscape. “For the name to succeed, it was necessary, or at least highly desirable, that Tim O’Reilly agree and actively use it in his many projects on behalf of the community. Also helpful would be use of the term in the upcoming official release of the Netscape Navigator code. By late February, both O’Reilly & Associates and Netscape had started to use the term,” Peterson wrote.
As the months went by, open source’s popularity only continued to grow to a world where we can’t imagine not using the term — or the code. “A quick Google search indicates that ‘open source’ appears more often than ‘free software,’ but there still is substantial use of the free software term, which remains useful and should be included when communicating with audiences who prefer it,” Peterson wrote.
After the term was coined, the industry felt there needed to be an organization put in place that would act as a steward of the term, and thus the Open Source Initiative was formed. “The Open Source Initiative (OSI) is a nonprofit corporation with global scope formed to educate about and advocate for the benefits of open source and to build bridges among different constituencies in the open source community,” the OSI wrote on its website. “Open source enables a development method for software that harnesses the power of distributed peer review and transparency of process. The promise of open source is higher quality, better reliability, greater flexibility, lower cost, and an end to predatory vendor lock-in.”

For many people who became programmers after the ‘90s, working publicly, pulling down open-source code, sharing improvements back ... became the default. It was what was expected. —Brian Behlendorf

Open-source software vs free software
According to OSI’s Phipps, the history of software freedom dates back a lot further than 20 years ago, and Richard Stallman was one of the first people who started the free software movement with the GNU project in 1983, which resulted in the GNU public license.
“Without the genius insight of Richard Stallman (aka RMS) that existing copyright and licensing mechanisms could be leveraged to enable the distribution and sharing of software—freely and openly—none of us would be here talking about this today. There would have been no open source software history without the work of GNU, FSF,


GNOME, and others. All of software development owes them a huge debt,” said OSI’s Brasseur.
Today, Stallman is the president of the Free Software Foundation, and while he is a significant figure in this space, Stallman does not agree with the term open-source software. According to Stallman, open source is a term that was adopted in 1998 by people who reject the free software movement’s philosophy. “When we call software ‘free,’ we mean that it respects the users’ essential freedoms: the freedom to run it, to study and change it, and to redistribute copies with or without changes. This is a matter of freedom, not price, so think of ‘free speech,’ not ‘free beer,’” Stallman wrote in a post.
He explained that the term open source “misses the point.” While open source was created as a marketing campaign for free software, along the way the meaning has transformed. “The term ‘open source’ quickly became associated with ideas and arguments based only on practical values, such as making or having powerful, reliable software. Most of the supporters of open source have come to it since then, and they make the same association,” he wrote. “The two terms describe almost the same category of software, but they stand for views based on fundamentally different values. Open source is a development methodology; free software is a social movement. For the free software movement, free software is an ethical imperative, essential respect for the users’ freedom. By contrast, the philosophy of open source considers issues in terms of how to make software ‘better’— in a practical sense only. It says that nonfree software is an inferior solution to the practical problem at hand.”
OSI’s Phipps believes open-source software and free software have the same meaning or purpose; they are just articulated in different ways according to the preference of the organization and the people who are articulating it. Phipps explained that the term open source was only created as a marketing program for free software. “A discussion about a philosophy doesn’t often get very far with a business, so people who are talking about open source tend to lead with the benefits of having the freedoms,” he said.
But Stallman disagrees. According to Stallman, the term open source was meant to remove the ethical language because it made businesses and people “uneasy.” “When open source proponents talk about anything deeper than that, it is usually the idea of making a ‘gift’ of
continued on page 24 >

And today, these projects stand out
Today, open-source projects are prevalent, and are leading innovation in software development and delivery. Among the most widely used projects today are: AngularJS, Ansible, Docker, Istio, Kubernetes, Node.js, npm, React, Spring, TensorFlow and VS Code.



< continued from page 23

source code to humanity. Presenting this as a special good deed, beyond what is morally required, presumes that distributing proprietary software without source code is morally legitimate,” he wrote. “The philosophy of open source, with its purely practical values, impedes understanding of the deeper ideas of free software; it brings many people into our community, but does not teach them to defend it. That is good, as far as it goes, but it is not enough to make freedom secure. Attracting users to free software takes them just part of the way to becoming defenders of their own freedom.”

OSI believes open source means more than just access to source code. To be considered open source, software must comply with 10 criteria:
• Free redistribution
• Source code
• Derived works
• Integrity of the author's source code
• No discrimination against persons or groups
• No discrimination against fields of endeavor
• Distribution of license
• License must not be specific to a product
• License must not restrict other software
• License must be technology-neutral

The tipping point for enterprise adoption
The coining of the term open source was meant to pave the path for businesses to adopt free and open-source software, but businesses didn't travel there overnight. In 2001, Steve Ballmer, CEO of Microsoft at the time, described the open-source operating system Linux as a cancer. "Linux is a cancer that attaches itself in an intellectual property sense to everything it touches," he said. Of course, since then Ballmer has changed his stance now that he no longer sees Linux as a threat. Microsoft has also embraced open source, and is now one of its biggest contributors.

A matter of trust and concern
One of the biggest concerns in the open-source world is whether or not users can trust the security of the project. Over the years, the dedication of developers, companies and organizations like the Core Infrastructure Initiative looking to support projects has dissipated this fear, but concerns still remain. GitHub's 2017 Open Source Survey revealed 86 percent of respondents find security a top concern.
While public open source typically implies there are more eyes looking at the code, it doesn't imply that it is more secure, according to Guy Podjarny, CEO of Snyk. The open-source projects backed by foundations or corporations tend to be very good from a security perspective, but there are still smaller open-source projects out there backed by individuals or a group of developers who aren't as well-equipped to maintain security.
In Snyk's 2017 State of Open Source Security report, the company found open-source library vulnerabilities increased by 53.8 percent in 2016, the mean time from disclosure to a fix being released is 16 days, 79.5 percent of maintainers don't have a public-facing disclosure policy in place, and of 433,000 sites tested, 77 percent have at least one client-side JavaScript library with a known security vulnerability. "The open source landscape is massive and only getting more diverse. The overall security of open source is an important measuring stick. We need to know where we stand today to know what we can do better," the report stated.
Forty-three percent of Snyk respondents stated they never audit their code, and 75 percent of vulnerabilities are not discovered by the maintainer.
According to the report, the lifecycle of open-source security should include: discovering vulnerabilities, releasing fixes, notifying users and adopting published fixes. "It's clear we have some room for improvement, but it's also clear we have a lot of opportunities to do so. It's easy to see that maintainers are eager to make their projects more secure and that users want to make security a priority in their open-source consumption. It's just a matter of ironing out the wrinkles a bit," the report stated.
Some tips on maintaining and improving the security of open-source code include having a public-facing disclosure policy, running regular audits and security checks, and making it clear to users that you care about security. If you use open-source code, Snyk says you should check for any known vulnerabilities in third-party components, contribute back to the community, and report vulnerabilities as responsibly as possible.
"Security is a complicated beast, and developers are not security experts. Open source is a very high-scale problem. We need the tooling and ecosystem to better cater to the open-source community," said Podjarny.
In addition, Brian Fox, CTO of Sonatype, says users need to be aware that the bar to producing open-source software has been significantly reduced, as well as the bar of people paying attention. The next generation of publishers need to think about how their choices are going to potentially impact consumers. "There is a magnifying effect that you can write something and people can use it, and that is awesome, but what is not awesome is if you write something and everyone gets hacked or in extreme cases people potentially die because systems crash because of a careless developer."
"Most of them aren't setting out to do it on purpose, but they are equally not thinking about the unintended consequences of the things they do," he added.
Fox suggests developers think about their choices as if they are protecting millions of people, because as a producer of open-source software that is essentially what is happening.
"Generally, it is good that we are able to share and reuse work without having to keep solving the same problem over and over again. I think we just have to get more mature in how we manage it. Then we will be able to really recognize the benefits of it without unmitigated downside," he said. z
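To make the "check your third-party components" advice concrete, here is a minimal sketch of the kind of check a build pipeline could run. Everything in it is a hypothetical placeholder: the advisory feed and the dependency list are hard-coded for illustration, whereas a real pipeline would pull both from a lockfile and a vulnerability database or scanner.

```python
# Minimal sketch: fail a build when a dependency matches a known advisory.
# The ADVISORIES feed and the dependency list are hypothetical placeholders;
# a real pipeline would pull this data from a scanner or vulnerability database.
import sys

ADVISORIES = {
    # package name -> set of affected versions (hypothetical examples)
    "example-http-lib": {"1.0.0", "1.0.1"},
    "example-yaml-parser": {"2.3.0"},
}

def audit(dependencies):
    """Return the (name, version) pairs that match a known advisory."""
    return [(name, version)
            for name, version in dependencies
            if version in ADVISORIES.get(name, set())]

if __name__ == "__main__":
    # In a real build this list would be parsed from a manifest or lockfile.
    deps = [("example-http-lib", "1.0.1"), ("example-json-lib", "4.2.0")]
    findings = audit(deps)
    for name, version in findings:
        print(f"known vulnerability: {name} {version}")
    sys.exit(1 if findings else 0)  # a non-zero exit code breaks the build
```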




The company recently announced the acquisition of GitHub for US$7.5 billion. "Open source has gone from something that was almost anti-company to something that really got embraced by businesses. Open source really set the standard for how you can more effectively work together, and companies are now embracing that way of working as well," said Sid Sijbrandij, CEO of GitLab. "Over time, all the concerns with licenses and how to work together with the community got better, and as a result it got more popular." In spite of that, OSI's Brasseur thinks most businesses still haven't realized the importance of open source. "I've seen companies shut down their open-source programs. I've seen companies swear they don't use any open-source software and then seen the stunned looks on their faces when they're shown how much of their stack is free and open-source software. I've seen companies release faux open source, either by throwing unlicensed and unsupported projects over the wall and calling them 'open source' or by

releasing them under proprietary licenses and claiming they’re ‘open source’ when at best they may be ‘source available,’ " she said. She does admit that there has been an explosion of corporate involvement, contribution and sponsorship. For Hyperledger’s Behlendorf, the tipping point for businesses to realize the benefits was in the late ‘90s when the Netcraft web server service conducted a survey asking about the web servers businesses were running. “That survey was this compelling visual indication of the prominence of the Apache Web Server,” he said. “It is still the main web server running on the majority of active websites on the Internet today.” Behlendorf explained that this was the first time the non-technical audience could visualize that there was something important happening here. In addition, Behlendorf said he was approached by IBM in the late ‘90s. IBM said it recognized something was happening in the Apache world, and wanted to be a part of it. This interest was borne

20th Anniversary of Open Source

Top contributors to the Linux kernel
In 2017, the Linux Foundation released a report that detailed who contributed most to the Linux kernel from the 2.6.11 release (which was the beginning of the git era) to the 4.13 release. Here are the top 10 contributors:

CONTRIBUTOR: # OF CHANGES
H Hartley Sweeten: 6,034
Al Viro: 5,904
Takashi Iwai: 5,089
Mauro Carvalho Chehab: 5,039
David S. Miller: 4,044
Johannes Berg: 4,014
Mark Brown: 3,978
Tejun Heo: 3,951
Russell King: 3,692
Greg Kroah-Hartman: 3,593

—Jenna Sargent

continued on page 27 >




When corporate swoops in BY CHRISTINA CARDOZA

The enterprise is not only embracing open source, but it is becoming more involved in the community today. Every day you see companies either releasing open-source software, contributing to a project, or even taking over a project. But you also see companies acquiring or starting to work on a project to benefit their own solutions and vision. For instance, Microsoft recently announced it acquired GitHub, the web-based hosting service that is home to many open-source software projects. While GitHub itself isn't an open-source software project, there are fears that come along when a corporate company buys an open-source related company. Some fear that "they merge that into an existing solution from them, caring more about what their corporate customers want, rather than the open source community in general," said Martin Gontovnikas, vice president of marketing and growth at Auth0. Gontovnikas says as long as Microsoft keeps GitHub separate, there should not be anything to worry about. "Microsoft's decision to keep Github separate with Xamarin's CEO as the new CEO is a very smart idea. They're showing from the get-go that they won't 'corporatize' Github, or at least it doesn't seem so," he added. Jim Zemlin, executive director at the Linux Foundation, believes Microsoft's acquisition of GitHub is good news for the open-source world, and should be celebrated. "Should the open source community be concerned? Probably not. Buying GitHub does not mean Microsoft has engaged in some sinister plot to 'own' the more than 70 million open-source projects on GitHub. Most of the important projects on GitHub are licensed under an open-source license, which addresses intellectual property ownership. The trademark and other IP assets are often owned by a non-profit like the Linux Foundation... And let's be quite clear — the hearts and minds of

developers are not something one ‘buys’ — they are something one ‘earns,’ ” Zemlin wrote in a post. OSI’s president Simon Phipps believes the fear developers have of corporate takeover of open-source projects is more of a conspiracy theory. “Obviously a company is not going to harm themselves. There are benefits that accrue to them, and that’s shared maintenance and external innovation,” he said. When a corporate company

steps into an open-source project, it is important that it remains transparent about future plans and goals, according to Mik Kersten, CEO and co-founder of Tasktop. “In my experience the effect of a corporate entity taking over an open-source project depends most on the business model of an entity. If there is a high degree of alignment with continuing the project and supporting its community, it can work. Typically this means the business can monetize the open-source user base either directly or indirectly. If there is a lack of alignment, the project can quickly see staff cut from it and start to decline. I’ve lived examples of both, and the best suggestion I have is to be clear to the community about where the project is headed and why,” Kersten said. It’s not just being transparent about the code, Sid Sijbrandij, CEO of GitLab, added. It is about being transparent about the decision process, the roadmap, what is going well, and what isn’t going well. Tasktop’s Kersten created the opensource Eclipse Mylyn project, a task and

ALM framework for Eclipse. According to Kersten, one of the reasons he made Mylyn open source was to make it easier to get contributions from the community. Kersten decided to move Mylyn to the Eclipse Foundation because he believed it could help him have a bigger impact on developer productivity. According to Kersten, when corporate takes over, developers need to look at how the open-source project is going to be structured going forward, what the governance model for the project looks like, and the licenses and contributor license agreement. "Once a company or even an individual gets more serious about embracing an open-source project, it becomes very important to ask questions because it will determine what happens or give you a sense for what will happen with a course of the project in the future," he said. "For me, I try to distinguish whether the project is open source, which means the source code is available, or it is actually openly developed, which means it is very easy for an individual or company to start contributing to the project." While Brian Behlendorf, executive director for the blockchain consortium Hyperledger, hasn't seen any examples of a corporate company taking over a project and trying to force developers to do things, there are times where a company comes in and ends up prioritizing the kinds of things they find important. "The bigger risk is generally more of where you have an open-source project. If a majority of the developers work for a particular company, then it is harder to ensure that the door is open," he said. "Ideally, what you are trying to do is get this perpetual motion machine going and tell the world that this is a community project." The harder question to answer is whether or not a project will outlast a developer or company. Efforts like the Linux Foundation and Apache Software Foundation aim to help projects exist and grow into multi-stakeholder projects, Behlendorf explained. z



< continued from page 25

from a survey IBM reportedly conducted amongst their Fortune 100 customers. The company asked the CIOs of each of those companies how many were using Linux or other open-source software in their infrastructure. While only a handful of CIOs reported they were using it, when the company repeated the same question to technical managers and system admins at a lower level who worked closer to the code, a majority reported using Linux or open-source technologies. "It was an indicator that there was commercial opportunity, not just interesting curiosity," Behlendorf said. GitLab's Sijbrandij agreed with Behlendorf that IBM's embrace of the Apache Web Server was momentous at the time. "It was one of the most respected brands, and they were adopting this open-source project," he said. Sijbrandij also gives Oracle's acquisition of MySQL and Google's release of Kubernetes credit for bringing the

open-source movement to enterprises. Sonatype's Fox believes the tipping point happened when build systems made it possible to consume open source. However, OSI's Phipps says the transition to enterprise adoption has been more of a gradient than a tipping point. "I think people gradually understood the value of using software where many people are collaborating around it," he said. "What tends to swing things for people is when they realize that open source isn't about giving everything away, but actually it is an alternative way of investing your intellectual property."

The state of open source Today, you would be hard-pressed to find a solution that doesn’t contain some form of open-source software. Forrester and Gartner have reported 80 to 90 percent of commercial software developers use open-source components within their solutions.

20th Anniversary of Open Source

“Nowadays, if you started a project, it would be unthinkable for you to decide you were going to build everything from the ground up and start with that massive investment,” said OSI’s Phipps. Open source is everywhere, Behlendorf stated. It is about empowerment and giving the community the tools to create economic value. “Proprietary code still has a place, but it now has to justify its existence rather than the other way around,” he said. Closed-source software is starting to become frustrating, according to GitLab’s Sijbrandij. If you find a bug in closed-source software, you can’t solve it. You don’t have a way to access it. “It is like driving a car where you can’t open the hood. That is super frustrating because if you want to fix a bug or add

Red Hat's "open-source way" BY MATT SANTAMARIA

Since its founding in 1993, open source has been in Red Hat's DNA. From offering one of the earliest Linux operating system distributions to creating the Fedora project for the development of free and open source software, to managing the opensource.com website, Red Hat has been synonymous with an open and collaborative culture. According to Red Hat, "open collaboration actually removes challenges that many other enterprises face by eliminating bottlenecks and ensuring a free flow of information." Red Hat's origins date to 1994, when Marc Ewing released his own distribution of Linux called Red Hat Linux. Gaining the interest of businessman Bob Young, the two teamed up to create Red Hat Software. Legend has it that Ewing chose the name Red Hat because of a Cornell University lacrosse hat he was given. Cornell's sports teams are known as 'Big Red.' In those early years, the company was able to experiment, innovate, and perfect a community-based development model. "Red Hat gained lots of experience participating in communities, adding features and functionality desired by customers, and then testing, hardening, compiling, and distributing stable, workable versions to customers," the company wrote. "It was during this time that Red Hat emerged as the open-source leader, a role it still enjoys today." In 2000, open-source evangelist Matthew Szulik became the next Red Hat CEO. The company released its flagship Red Hat Enterprise Linux in 2002. "[Open Source] was a seller of the Red Hat Enterprise Linux distribution," said Mike Walker, global director of Red Hat. "Since then, Red Hat has grown exponentially, becoming the first $1 billion open-source company. It expanded its business into numerous new sectors including enterprise storage, container management, middleware, cloud computing, training, and much more. Over the years Red Hat has acquired numerous other companies (once a company is acquired by Red Hat, it open-sources the technology) to allow it to expand its product offerings and further build out open source software solutions." Today, the company is led by CEO Jim Whitehurst, who took the reins in 2007. In 2009, Whitehurst came up with the company's mission statement: "To be the catalyst in communities of customers, contributors, and partners creating better technology the open source way." According to Red Hat, the open source way is a way of thinking. It is about having the freedom to see the code, to learn from it, to ask questions and offer improvements. The open source way applies to principles that transcend enterprise IT, such as culture, transparency, adaptability, and collaboration. "The open source way is changing the world in the same way the open source model has changed software," said Walker. "The characteristics found in the communities of open-source coding define our work culture and perspective." "Instead of being seen as an alternative choice or a cost-saving option, open source is now the new normal in enterprise technology," Walker added. "It is where innovation is happening and it's enabling organizations to take a more agile approach to digital transformation. Beyond that, proprietary software is now being seen as a 'lock-in' for organizations and it is being avoided or discarded wherever possible." z

continued on page 28 >



< continued from page 27

new wiper fluid, and you can’t if it is closed. No one would accept a car like that, and developers are starting to reject software that is made like that,” he said. What we are seeing now is that collaborative innovation is working, said OSI’s Phipps. “We have seen Google and Facebook bring themselves into existence with layers of open source. We have see millions of startups being able to get going because they are able to install Linux, run Apache Server, run Apache Tomcat and use open source to their advantage.” Now that it is the default, the next question to ask is where is it going. “We have all the freedom we need in place, so now we have the luxury of being able to ask meta questions about governance, continuous improvement, safety and so on,” said Phipps. Despite all the progress the opensource space has made, OSI’s Brasseur explained there is still plenty of room to grow.. While the approach of “by programmers for programmers” has gotten open-source software this far, to keep the momentum going, Brasseur said we need to shift to the idea of “by programmers for others.” “The usability, accessibility, and documentation of most FOSS projects are in such a state as to be entirely out of reach of people who don’t spend their lives steeped in technology and software development. We’re not going to further the missions of free and open source software if we can’t start developing software that reaches out to a new market of users and, most importantly, meets them where they are rather than expecting them to read code, edit config files, or open up a terminal just to perform basic tasks,” she said. Brasseur says we are moving to FOSS v3.0, where free and open source has evolved into “Business As Usual.” Version 2.0 was the launch of the opensource definition, and version 1.0 was the dawn of free software, she explained. As we move towards version 3.0, we have to be mindful of how open-source software communities are going to work with corporations, and not lose sight of the open source’s mission and bigger picture, Brasseur explained. z

The Apache License allows open source to thrive
Open source is celebrating its 20th anniversary, and so is the Apache License. The Apache License is a permissive free software license that is currently in its third iteration. The license allows customers to use intellectual property for any purpose, such as modifying or distributing it. According to Roman Shaposhnik, member of the Apache Software Foundation board of directors, the license was created from a combination of business interests and a desire of the Apache Group (which later became the Apache Software Foundation) to ensure that the community around the Apache httpd web server grew. That Apache web server was actually the first project to be licensed under the Apache License, Shaposhnik said. "These licenses help us achieve our goal of providing reliable and long-lived software products through collaborative open source software development. In all cases, contributors retain full rights to use their original contributions for any other purpose outside of Apache while providing the ASF and its projects the right to distribute and build upon their work within Apache," the Apache Software Foundation wrote on its website. The ASF maintains stewardship over the license. Currently, all Apache Software Foundation projects are required to be under the Apache License, said Shaposhnik. "Apache License had a huge influence on legitimizing Open Source within the enterprise and business communities," said Shaposhnik. "While initially GPL made sure that Open Source can survive, it wouldn't be a stretch to say that Apache License made Open Source thrive." The Apache License has gone through three different versions since its creation. Version 1.1 was approved by ASF in 2000, with the primary change from 1.0 being the addition of the advertising clause. The advertising clause specified that derived products were no longer required to include attribution in advertising materials, just in their documentation. The current version, 2.0, was approved in 2004, with several new goals in mind. With this version, the ASF wanted to reduce the number of frequently asked questions surrounding the license, allow it to be reusable without modification by any project, allow it to be included by reference instead of having to be listed in every file, clarify the submission of contributions, require a patent license on contributions that infringe on the contributor's own patents, and move comments regarding Apache and other inherited attribution notices to an outside "NOTICE" file, Shaposhnik said. These changes resulted in the license that is used today, one that is compatible with other open source licenses while still remaining true to the initial goals of the ASF. It also resulted in a license that is supportive of collaborative development across nonprofit and commercial organizations. The current version has been in place for 14 years, and while there may be business and technology reasons to change it in the future, Shaposhnik compares it to the United States Constitution in that a lot of interests depend on how clearly it can be interpreted and understood. "With the current version of Apache License we have 14 years' worth of foundation to be sure that we understand it pretty well," Shaposhnik explained. "Any tweak to the license will have to withstand a comparable test of time."
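For reference, applying the license in practice typically means keeping the full LICENSE text (and, where applicable, a NOTICE file) at the root of the project and attaching the short boilerplate notice that the license text itself suggests adding to source files. Below is a sketch of that notice as it might appear at the top of a Python source file; the year and copyright holder are placeholders to be replaced by the actual owner.

```python
# Copyright 2018 Example Copyright Holder
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
```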

—Jenna Sargent



Buyers Guide

Application security needs to shift left BY JENNA SARGENT

As teams are pressured to release software more rapidly, more and more aspects of software development are being forced to "shift left," moving earlier in the development lifecycle. Because of the speed at which code is updated and delivered, security can no longer be thought of as an afterthought, said Rani Osnat, VP of product marketing at Aqua Security, a company that specializes in container security. "That's why we profess to shift left security and basically embed it as early as possible in the development process so that developers can do a lot of the work in advance as they deliver the applications and not expect to throw it over the fence and have someone else take care of it." Operations teams can no longer accept an application as is and plan on securing it once it is deployed in the runtime environment, Osnat said. Application security used to act as governance and as a gate that security teams applied to evaluate the security of software before it was deployed. "I think as trends like agility or trends like con-

tinuous delivery or DevOps come into play, that role as a point-in-time gate and as a governance function is being questioned,” John Steven, senior director of software security at Synopsys, an application security company, explained. He added that when teams go to implement security, they often search through regulations or information on the web to look for what they should care about. “I think organizations are struggling to figure out what’s the difference between what the web tells me I should look for in terms of security problem and what would impact my business in terms of risk,” said Steven. “And so they’re struggling to figure out what they need to pay attention to.” They question how attackers will explore their organization and attack its assets and how that is different from what they paid attention to in the past. They also question how they will adapt the sensors that are already in place to look for vulnerabilities, Steven explained. Though many organizations have already adopted DevOps, one trend is now DevSecOps, which adds a security team in addition to the development and operations teams. Osnat believes that security teams

should be responsible for creating and enforcing security policies and determining what is an acceptable level of risk. The implementation of those policies, however, should be handled jointly by security and development teams. According to Osnat, there is a shortage of cybersecurity professionals, and that shortage is not getting any smaller. According to a survey published by the National Institute of Standards and Technology this June, there are 301,000 open cybersecurity jobs throughout the United States. A report from Cybersecurity Ventures predicts that the number of openings will rise to 3.5 million by 2021. “On the other hand, there are many more developers in the world,” said Osnat. “If you look at it as a global issue, basically what’s happening is that developers are developing more applications faster and delivering code faster than security can catch up to. That’s something where really the only way to address it is not to just give more work to security, but to move some of the burden to the developers in using best practices to secure applications when they are developed.” continued on page 39 >


NEXT-GEN AUTOMATED APPSEC TOOL THAT DOES NOT SLOW YOU DOWN

EVERYTHING WINNING TEAMS NEED:

AUTOMATED

INTELLIGENT

INTEGRATABLE

Reduces the attack surface while keeping you in full control

Presents you with a list of solutions to review instead of a list of problems to resolve

Seamlessly integrates into your workflow

Find out more at www.mycode.ai



< continued from page 37

The shortage of cybersecurity professionals can also be addressed by incorporating artificial intelligence into DevOps and security workflows. “A shortage of skilled security professionals on both sides (AppSec and CyberSec), and their relatively high cost will drive an adoption of intelligent automation powered by AI systems and Quantum computing.”

Shifting culture as well
There is also the issue that shifting testing left requires a huge cultural change within the organization. "Cultural imperatives are very hard for organizations to adopt because organizations reject culture change like viruses," said Synopsys' Steven. Even though the spirit of DevOps involves breaking down the silos between developers and operations, that does not always happen, explained Steven. Often, organizations will hire a DevOps engineer, typically reporting up to operations. "They've taken this cultural imperative to break down the walls, and they've turned it into a role in one of the silos, which is of course a perversion of the intent." "I would hate for DevOps just to become a set of tools that a security group or operations group buys to engage developers more effectively, but they all stay in their silo," Steven continued. Steven explained that the companies that have successfully scaled up well and handled performance well, those are the companies that effectively broke down those silos. Those organizations made security everyone's job and the security team acted as a coach on the sidelines, while also enabling visibility into what was going well, what was going poorly, and where more time needed to be spent, he said. When organizations aren't able to break down those silos and let developers handle security, it may be a result of organizations not planning out their goals correctly from the top of the organization down to the individual teams, explained Pete Chestna, director of developer engagement at CA Veracode, a provider of an automated end-

to-end service that simplifies application security. Companies should look at their goals and whether or not the development teams are accountable for what they build. If they’re not, that’s an area that needs to be addressed within the organization. When development teams have the option, they may push the responsibility onto some other group. “Once that becomes a non-option then they start to make that change real,” said Chestna. “There’s a lot of automation that you

Analyze, attack, continually test
According to Arkadiy Miteiko, co-founder and CEO of CodeAI, an AI-based security platform, the top performers in the industry typically have three things implemented in their security workflows (a minimal sketch of gating a pipeline on these checks follows the list):

1. They inject code analysis tools into the development process and enforce fixes prior to deployment;
2. They automate attacks against pre-production code and prevent that code from reaching production if the attacks are successful; and
3. They continually test the production environment for weaknesses in an automated fashion.
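The sketch below shows only the control flow implied by those three practices; each check function is a hypothetical stand-in for a real tool (a static analyzer, an automated attack run against staging, and a recurring production scan), and the URLs and commit identifiers are made up for illustration.

```python
# Sketch of a promotion gate built around the three practices above.
# Each check is a hypothetical stand-in for a real tool's pass/fail result.

def static_analysis_passed(commit: str) -> bool:
    # e.g. parse a static-analysis report and fail on unresolved findings
    return True

def staging_attack_passed(staging_url: str) -> bool:
    # e.g. run automated attacks against pre-production and inspect results
    return True

def production_scan_passed(prod_url: str) -> bool:
    # e.g. a scheduled, repeated scan of the live environment
    return True

def promote(commit: str, staging_url: str) -> bool:
    """Allow a release to move forward only if every gate passes."""
    if not static_analysis_passed(commit):
        print("blocked: static analysis found unresolved issues")
        return False
    if not staging_attack_passed(staging_url):
        print("blocked: automated attack succeeded against staging")
        return False
    print("promoting to production")
    return True

if __name__ == "__main__":
    promote("abc123", "https://staging.example.com")
    # The production scan runs on its own schedule, independent of releases.
    production_scan_passed("https://www.example.com")
```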

can do, which again is absolutely mandatory in these environments because of the speed in which code moves in the pipeline,” said Osnat of Aqua Security. “It is just not manageable with purely manual control.”

A role for artificial intelligence
Introducing AI into the equation can solve some of the issues here. "Generally speaking, AI is extremely good at recognizing patterns and making statistical predictions based on its pattern recognition," said Arkadiy Miteiko, co-founder and CEO of CodeAI. "Noise is a recognizable pattern. Once it has been recognized it can be filtered out. The quality and security issues that we are dealing with in code today are the same coding errors we fixed years ago." Shifting the burden to developers seems like the ideal solution, but often the developers' education did not prop-


erly prepare them to code securely. "It's a muscle that development organizations don't have," CA Veracode's Chestna explained. "If you allow developers to continue to code incorrectly and then correct them later, you're not really helping them be better," said Chestna. "DevOps is all about continuous improvement. So we need to take the knowledge of what they struggle with and we feed that back to them in the form of training and then measure whether or not that training was effective, and they would get better in that process." According to Chestna, the idea of coding securely can be taught; it is just a matter of whether organizations will put pressure on universities to change their curriculum. "They're not going to do that until we change the requirements," he said. "So until you start to say that this is something that I want to hire, and I want your university to support this — that's something that's not going to happen, but that's really the shift left that I want to see." Looking towards the future, many experts agree that there is still much to be done. "I think the fact is it is growing," said Osnat. "I think the first generation of solutions that were out there were very much tied to specific programming languages and specific environments. I think as we move into cloud-native applications a lot of these things start to go away because they are created to run in different environments, to be a lot more flexible." Osnat also believes that we are not very far away from a day where a lot of companies that provide development platforms will embed security tools in those platforms. "If, in the next five years, vendors are able to provide the industry with tools that have the capabilities required to win this security game, we'll begin to see drastic improvements in the overall security posture," said Miteiko of CodeAI. In the future, the burden will not just fall to the developers and security teams. Software vendors will be expected to integrate security into their tooling as well. z


How these companies can help make your applications more secure

Dror Davidoff, co-founder and CEO of Aqua Security
Aqua Security enables enterprises to secure their container-based and cloud-native applications from development to production, accelerating container adoption and bridging the gap between DevOps and IT security. Images serve as a container's foundation, and developers can easily pull them from a centralized registry to run containers in a highly automated, flexible process. From a security and governance perspective, trusting the container image becomes a top priority. At the same time, runtime environments with a new stack that includes container runtime engines, orchestration platforms such as Kubernetes, and cloud-native network overlays, present a challenge in providing visibility and control over containerized applications. The Aqua Container Security Platform delivers the most comprehensive solution for securing containerized environments, supporting a broad range of platforms, for "on-prem" deployment as well as AWS, Google, and Azure cloud deployments. Aqua's solution provides full lifecycle security for containers, hardening the technology and implementing tight, enforceable governance of the entire development process, with a special focus on runtime. As container adoption rates surge, and the infrastructure for cloud-native continues to evolve to include Container-as-a-Service (CaaS) and serverless approaches, Aqua is investing in supporting our customers' DevSecOps initiatives on their platform of choice. For example, the recent release of v3.0 introduced native support for Kubernetes and a new MicroEnforcer model that enables security and monitoring in CaaS environments such as AWS Fargate and Microsoft ACI. Aqua integrates and automates strong, enforceable security controls into the application development lifecycle from the moment a container is created until it is decommissioned. By providing a comprehensive platform for securing containerized environments, Aqua enables

customers to extract all the cost, agility, and efficiency benefits that containers offer without increasing their risk profiles.

Arkadiy Miteiko, co-founder and CEO of CodeAI
CodeAI is the only SAST solution currently available that does not slow you down. It is no myth that introduction of security coding requirements does slow DevOps down. CodeAI enables DevOps to maintain their speed as they work on hardening applications against known cybersecurity threats. It reduces noise in the tool chain (i.e. false positives) and generates actionable solutions for the issues found. Developers can spend more time coding new things and less time fixing old code. It amplifies security standards and enables developers to successfully meet them. CodeAI is available as a cloud-based service for open-source projects and can be deployed on-premises for commercial customers. It easily integrates with your SDLC toolchain and delivers value within a few weeks after deployment. It is ideal for the teams that are looking to buy performance, not just a product.

John Steven, senior director of software security at Synopsys
In many ways, application security has always struggled to find its seat at a larger table during broader enterprise security or development conversations. Synopsys consulting services and developer-centric tools have always helped owners of application security initiatives find their seat at those larger tables. Today, in the face of movements like 'DevOps', we find that helping application security coach development in ways to 'accelerate the delivery of software' is crucial. It both credentials those maturing security initiatives and finds them that seat at the table. So, at Synopsys, we help client organizations modernize their software lifecycles, bringing the appropriate aspects of security in at every lifecycle phase. The

result is increased agility and more automated governance, as well as reduced barriers between traditional silos like Development, Operations, and Security Governance.

Pete Chestna, director of developer engagement at CA Veracode
CA Veracode enables the secure development and deployment of the software that powers the application economy. This includes open-source technology and your own first-party developed code. With its combination of automation, process and speed, CA Veracode becomes a seamless part of the software life cycle, eliminating the friction that arises when security is detached from the development and deployment process. As a result, enterprises are able to eliminate vulnerabilities during the lowest cost-point in the development/deployment chain so they can fully realize the advantages of DevOps environments while ensuring secure code is synonymous with high-quality code. CA Veracode provides the three key criteria for fitting into today's DevOps methodology. It is fast, provides an industry-best low false-positive rate and integrates into popular tools out of the box. CA Veracode can scan applications with a combination of static, dynamic and software composition analysis to provide a comprehensive view of risk prior to deployment. CA Veracode helps train development teams through a combination of on-demand eLearning, instructor-led training and guidance provided directly through our IDE integrations. Shifting left all the way to training allows you to bend the typical bug-fixing cost curve to zero. If you can train your developers to write it correctly the first time, or catch it as they write code, they actually code faster by avoiding costly rework. The CA Veracode services team helps you fix what you find. We have a dedicated team of security consultants that will work with your team to understand what was found, how to fix it and how to prevent it in the future. z



A guide to DevSecOps tools

FEATURED PROVIDERS
• Aqua Security enables enterprises to secure their container and cloud-native applications from development to production, accelerating application deployment and bridging the gap between DevOps and IT security. The Aqua Container Security Platform protects applications running on-premises or in the cloud, across a broad range of platform technologies, orchestrators and cloud providers. Aqua secures the entire software development lifecycle, including image scanning for known vulnerabilities during the build process, image assurance to enforce policies for production code as it is deployed, and run-time controls for visibility into application activity, allowing organizations to mitigate threats and block attacks in real time.
• CA Veracode creates software that fuels modern transformation for companies across the globe. DevSecOps enables the build, test, security and rollout of software quickly and efficiently, providing software that's more resistant to hacker attacks. Through automation, CA Technologies extends faster deployment with an agile back end that delivers more reliable releases of code, helping teams to work collaboratively earlier in the DevSecOps process to detect security vulnerabilities in every phase, from design to deployment.
• CodeAI is a smart automated secure coding application for DevOps that fixes security vulnerabilities in computer source code to prevent hacking. Its unique user-centric interface provides developers with a list of solutions to review instead of a list of problems to resolve. Teams that use CodeAI will experience a 30%-50% increase in overall development velocity. CodeAI takes a unique approach to finding bugs using a proprietary deep learning technology for code trained on real-world bugs and fixes in large amounts of software. CodeAI fixes bugs using simple program transformation schemas derived from bug-fixing commits in open source software.
• Synopsys helps development teams build secure, high-quality software, minimizing risks while maximizing speed and productivity. Synopsys, a recognized leader in application security, provides static analysis, software composition analysis, and dynamic analysis solutions that enable teams to quickly find and fix vulnerabilities and defects in proprietary code, open source components, and application behavior. With a combination of industry-leading tools, services, and expertise, only Synopsys helps organizations optimize security and quality in DevSecOps and throughout the software development lifecycle.

• Checkmarx provides application security at the speed of DevOps, enabling organizations to deliver secure software faster.
• Chef Automate is a continuous delivery platform that provides actionable insights into the state of your compliance and configurations, with an auditable history of every change.
• Contrast Access uses deep security instrumentation to analyze code in real time from within the application. Contrast Protect provides actionable and timely application layer threat intelligence across the entire application portfolio.
• CyberArk Conjur is a secrets management solution that secures and manages secrets throughout the DevOps pipeline to mitigate risk without impacting velocity.
• Datical's solutions make database code deployment as simple as application release automation while eliminating risks that cause data security vulnerabilities.
• IBM provides security testing with IBM AppScan or Application Security on Cloud. IBM helps you build your production safety net with application management, Netcool Operations Insight and IBM QRadar for security intelligence and events.
• Imperva WAF protects against the most critical web application security risks: SQL injection, cross-site scripting, illegal resource access, remote file inclusion, and other OWASP Top 10 and Automated Top 20 threats.
• JFrog Xray is a continuous security and universal artifact analysis tool, providing multilayer analysis of containers and software artifacts for vulnerabilities, license compliance, and quality assurance.
• Nosprawl integrates with development platforms to check for security vulnerabilities and verify that software is secure before it gets into production.
• Parasoft's static analysis violation metadata includes likelihood of exploit, difficulty to exploit/remediate, and inherent risk, so you can focus on what's most important in your C and C++ code.
• Qualys helps businesses simplify security operations and automates the auditing, compliance, and protection for IT systems and web applications.
• Redgate Software's SQL Data Privacy Suite provides a scalable and repeatable process for managing personally-identifiable information as it moves through your SQL Server estate.
• Rogue Wave Software's Klocwork static code analysis tool helps DevSecOps professionals create more secure code with on-the-fly security analysis.
• Signal Sciences' WAF and RASP help teams gain actionable insights, secure across the broadest attack classes, and scale to any infrastructure and volume elastically.
• Sonatype's Nexus platform combines in-depth intelligence with real-time remediation guidance to automate and scale open source governance across every stage of the modern DevOps pipeline.
• Sumo Logic provides a secure, cloud-native, multi-tenant machine data analytics platform that delivers real-time, continuous intelligence across the application lifecycle and stack. Sumo Logic simplifies DevSecOps implementation at the code level.
• WhiteHat Security's Application Security Platform is a cloud service that allows organizations to bridge the gap between security and development to deliver secure applications at the speed of business. z



Guest View BY PETE JOHNSON

The currency of software innovation
Pete Johnson is a principal systems engineer for cloud, containers and serverless in the Global Partner Organization at Cisco Systems Inc.

In the modern economy, every business is a software business. Why? Because companies have figured out that the easiest way to introduce innovation into a marketplace is with software, whose defining characteristic is that, well, it's soft. Instead of waiting months to make a change to some physical piece of equipment, software can be distributed quickly and widely to smartphone users, programmable manufacturing equipment, and a wide variety of other destinations. This is why you can Google just about any company of any size and find they have openings for software engineers. But how do you maximize the amount of innovation you can get out of them? It turns out, iterations are the currency of software innovation.

It's all about the at-bats
Venture capitalists are in the business of finding innovation and most of them will tell you that for every 10 companies they invest in, they are happy if one hits it big. Applying that same hit percentage to software development, companies have a 10% chance of any release containing an innovation that will stick with its intended audience, so is it better to have four chances at innovation a year with quarterly releases, 12 chances with monthly releases, or 52 chances with weekly releases? The strategic answer is obvious. More releases, more iterations of software, produce more chances at innovation. Tactically, though, how do you do that?


From monoliths to microservices
Virtual machines can be created in minutes and containers can be created in seconds, which changed the way that developers thought about application components. Instead of relying on in-memory or custom protocol communication, if each component had an HTTP-based API it could act as a contract between the components. As long as that contract didn't change, the components could be released independent of one another. Further, if every component could sit behind its own load balancer it could also be scaled independently, in addition to taking advantage of rolling deployments where old instances of compo-

nents are removed from behind the load balancer as new ones are injected. These are the modern tenets of a microservices-based architecture, which are more loosely coupled thanks to those API contracts than their monolithic predecessors, enabling faster iterations.
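As a concrete illustration, here is a minimal sketch of one such component using only Python's standard library: a tiny service exposing a versioned HTTP endpoint. The /v1/status path and the response fields are hypothetical; the point is that as long as this small contract stays stable, the component behind it (and the instances behind a load balancer) can be rebuilt, scaled, and redeployed on its own schedule.

```python
# Minimal sketch of a microservice exposing a versioned HTTP API "contract".
# The /v1/status path and response fields are hypothetical examples.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/v1/status":
            body = json.dumps({"service": "inventory", "healthy": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            # Anything outside the contract is simply not served.
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # A load balancer would front many instances of this process; rolling
    # deployments swap old instances out and new ones in behind it.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```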

Kubernetes is a big deal, and so is serverless
But now if you have hundreds or thousands of containers to manage for all these microservices, you need a way to distribute them across different physical or virtual hosts, figure out naming and scheduling, and improve networking because different components might be on the same host, negating the need for packets to go out to the network card. This is why Kubernetes is such a big deal and why Google (through GKE), AWS (through EKS), and Cisco (through CCP), among others, are so bought into the container clustering platform. And again, it's all in the name of iterations, so that development teams can more loosely couple their components and release them faster as a way of finding innovation. But what's next? The big deal over serverless architectures is that they could be the next step in this evolution. Instead of coupling components via API contracts, serverless functions are tied together through event gateways. Instead of having multiple instances of a component sitting behind a load balancer, functions sit on disk until an event triggers them into action. This requires a far more stateless approach to building the logic inside the individual functions but is an even looser coupling than microservices with potentially better underlying physical server utilization, since the functions are at rest on disk until necessary.
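To make the contrast concrete, a stateless function in that model reduces to a small handler invoked once per event rather than a long-running process behind a load balancer. The sketch below uses a generic (event, context) signature similar to what common function-as-a-service platforms pass in; the event fields shown are hypothetical.

```python
# Sketch of a stateless, event-triggered function (FaaS style).
# The (event, context) signature mirrors common serverless platforms,
# and the event fields below are hypothetical examples.

def handler(event, context=None):
    # No in-memory state survives between invocations, so everything the
    # function needs must arrive in the event or live in external storage.
    order_id = event.get("order_id", "unknown")
    amount = float(event.get("amount", 0.0))
    return {
        "order_id": order_id,
        "status": "accepted" if amount > 0 else "rejected",
    }

if __name__ == "__main__":
    # Locally, simulate the event gateway delivering an event:
    print(handler({"order_id": "A-1001", "amount": 42.50}))
```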

The bottom line
The bottom line is that the best way to find a good idea is to iterate through ideas quickly and discard the bad ones once you've tried them. This concept is driving application architecture, container clustering platforms, and serverless approaches in an attempt to remove as much friction from the software development and release processes as possible. The potential innovation gains from maximizing iterations are what just about every company is chasing these days, and it's all because iterations are the currency of software innovation. z



Analyst View BY MICHAEL AZOFF

The next evolution in application life cycle management

Software lifecycle management (SLM) is the discipline for managing software development across its life cycle. I've been covering this space since 2003 when I started as an IT industry analyst and have witnessed its evolution into the vendor tool category of application life cycle management (ALM). It's useful to ask what the difference is between SLM and ALM. How to do software engineering right is a perennial question on the minds of everyone in the software world, academic to commerce. The art of doing it right has evolved over many decades, with each generation adding to better practices and methodologies. We are in the aftermath of the Agile, lean, and DevOps revolutions/evolutions; thus, there is an increasing body of knowledge of useful ways to create software. This lasting knowledge is what SLM aims to capture. The commercial flip side is how vendors market tools to cover this need, and hence ALM is the market of tools that do SLM. The point being that ALM doesn't necessarily offer the ideal SLM for hard commercial reasons. In an important respect that highlights the gap that existed, the next stage in ALM evolution brings us nearer to SLM: a true end-to-end management of the life cycle that embraces the most recent trends of agile, lean, and crucially, DevOps. The older generation of ALM tools principally catered to the needs of developers, spanning the concept, requirements, testing, coding, building, and some basic delivery automation. However, DevOps has introduced a whole new level of automation and sophistication (ease of use at scale) in a new generation of release management tools. This tool category evolved quite separately from ALM, but seen from an SLM viewpoint, the end-to-end integration of tooling should have ALM include DevOps. ALM has gone through waves of adoption. In the world of enterprise IT, ALM has largely given way to essentially Agile life cycle management

(although the best of these tools have hybrid features for legacy projects to be managed in waterfall styles of work). Vendors market to the needs of their customers; some prefer loosely coupled tools, others prefer tightly integrated, highly centralized ALM suites. The open-source movement has had a huge impact in this space. The need for ALM extends also to advanced engineering and manufacturing, and with the digital transformation taking place across every industry, business products and services are becoming more

Michael Azoff is a principal analyst for Ovum’s IT infrastructure solutions group.

software-centric. The need for ALM, and especially the tightly integrated, highly centralized variety, is greatest here, and represents the growth market for ALM. The use of ALM, especially with safety-critical product development, is paramount for meeting the needs of regulations and compliance. The next evolution of ALM is therefore an integrated suite that includes DevOps release management functionality. It must allow IT teams to trace requirements across test, build, and deployment, to which container, on which cloud or mobile product (phone, pacemaker, car etc.); to audit changes and control access and deployments; and to know which issue in the field traces back to an individual developer for a rapid fix and update. Some of these features may be unfashionable today in enterprise IT, but are necessary in engineering manufacturing, especially in safety-critical product development; this is where the next generation of ALM is growing. z


Industry Watch BY DAVID RUBINSTEIN

Open source has won the day
David Rubinstein is editor-in-chief of SD Times.

In February 2000, I had been covering the technology space for all of 2½ months. As the newly minted executive editor of the nearly minted SD Times, I was sent by then editor-in-chief Alan Zeichick to cover the Linux World Conference and Expo. He said I might be interested in seeing a different side of development. I wasn't sure what he meant. Since my hire in September 1999, I had already attended a Java Business Expo (lots of Sun news and a Penn and Teller show!) and the ebusiness conference and expo (think Lotus Notes and productivity). There were huge expo halls, booths filled with engineers having discussions about using the latest Java technologies that would modernize and transform how business is done. (Sound familiar?) Then, I was assigned to the Linux World Expo. While there were vendors on hand (mostly offering different Linux distros), the feel was different. The first two events were clearly aimed at getting business done, and showing off the tools that would make that happen. Linux World felt like Thanksgiving with the family — everyone feels good about being together, but the bickering never stops! A big issue that day was fragmentation. Linus Torvalds was on hand to assure people that because of the open-source license, all modifications to Linux would have to be shared with the entire community, preventing — in theory — any one version from dominating. Kumbaya! But on the other side were companies like Red Hat and Suse Linux and the now-defunct VA Linux Systems, who had received massive capital infusions and needed to find ways to differentiate their offerings to make money. This, it was feared, would result in Linux distributions that did not work well with others. Interestingly, Sun was facing those same challenges with Java — companies such as IBM, Oracle, BEA, Gemstone, Bluestone and about 25 more agreeing to "cooperate on standards, compete on implementation." Open source was too immature, and too fractured, to be taken seriously in Fortune 500 companies. Back then, they purchased their IT from IBM, or SAP, or Siemens, or Microsoft, and paid


big dollars for service. Back then, not every company was a software company. At that Linux World, as compared to the other two technology events I had covered in my life to that point, I noticed a decidedly different look among the attendees. Gone were the businessmen. In their place were overcaffeinated, underbathed, zombie-looking creatures called "software developers." They were scary. They preferred solitude and darkness. They were pallid and showed signs of eating poorly. Surely, businesses would never bet the farm on people like these. They were hackers, for crying out loud! But then, as the years rolled on, something magnificent occurred. More, and less restrictive, licenses for using open-source software emerged. The Apache license gave developers the opportunity to alter open-source code and not have to return their modifications to the broader community. This made using open source a viable option for businesses. Suddenly, enterprise developers were finding that they could solve their own problems, use software created by their peers and modify it to create real business value. On the one hand, this created a shadow IT problem. On the other, it took open source out of the shadows. Companies began to form around open-source projects, offering premium service and functionality on top of the projects. It exploded. I remember talking to other technology reporters in 2000, asking if they thought Linux had a commercial future. Some saw the uptake in server rooms and were certain of it. Others believed Linux advocates to be nothing more than anti-vendor zealots and hobbyists who would rather write software themselves than pay for it, and that's where open source would remain. Well, it's easy to see that open source has won the day. Yes, there still is room for packaged software. Microsoft's Office 365 is big business, and Oracle still reaps big dollars from its ERP installations, to highlight but a couple of examples. But every year, reports show that more open-source code is making its way into enterprise software. The early vision of those pioneers highlighted in this issue's special report has come to fruition. I'm sure they are most satisfied. z

