SD Times - January 2019



JANUARY 2019 • VOL. 2, ISSUE 19 • $9.95 •






News Watch


Predictions for 2019


Instana APM Enables Total Traceability


SignalFx announces Microservices APM to speed troubleshooting for DevOps


OzCode makes debugging available as a service

Peace in a POD
2019: The Year of the Value Stream
Data decentralization allays privacy concerns

page 9

page 26




2018: Kubernetes solutions take over



Ethics at the fore of AI conversations


Java changes things up in 2018


Microsoft takes Azure to the next level


The conversation about Agile and DevOps is far from over


Security continues to be a black cloud for businesses


GUEST VIEW by Altaz Valani: The modern security hero is a developer
ANALYST VIEW by Peter Thorne: Developer or User?
INDUSTRY WATCH by David Rubinstein: Digital transformation is all about the timing

BUYERS GUIDE Digital transformation sparks low-code adoption page 33

JavaScript has come a long way and shows no sign of slowing page 29

Software Development Times (ISSN 1528-1965) is published 12 times per year by D2 Emerge LLC, 80 Skyline Drive, Suite 303, Plainview, NY 11803. Periodicals postage paid at Plainview, NY, and additional offices. SD Times is a registered trademark of D2 Emerge LLC. All contents © 2019 D2 Emerge LLC. All rights reserved. The price of a one-year subscription is US$179 for subscribers in the U.S., $189 in Canada, $229 elsewhere. POSTMASTER: Send address changes to SD Times, 80 Skyline Drive, Suite 303, Plainview, NY 11803. SD Times subscriber services may be reached at



EDITORIAL
EDITOR-IN-CHIEF David Rubinstein
NEWS EDITOR Christina Cardoza


CONTRIBUTING WRITERS Alyson Behr, Jacqueline Emigh, Lisa Morgan, Jeffrey Schwartz
CONTRIBUTING ANALYSTS Cambashi, Enderle Group, Gartner, IDC, Ovum





D2 EMERGE LLC 80 Skyline Drive Suite 303 Plainview, NY 11803













NEWS WATCH

Docker Desktop Enterprise unveiled at DockerCon
Docker is giving enterprise users a new way to create and deliver container-based applications on their desktop. The company announced Docker Desktop Enterprise at its DockerCon conference in Barcelona last month. Docker Desktop Enterprise is designed for Mac and Windows. With it, developers can work with the framework and programming language of their choice, and IT has the flexibility to securely configure, deploy and manage developer environments, the company explained. For IT teams, features include the ability to present developers with customized and approved application templates, and deploy and manage the solution with preferred endpoint management tools using standard MSI and PKG files. For developers, features include configurable version packs and an application design interface for template-based workflows. The new designer comes with foundational container artifacts to help developers new to containers get started right away, the company explained.

Amazon releases no-cost distribution of OpenJDK
Amazon wants to make sure Java is available for free to its users in the long term with the introduction of Amazon Corretto. The solution is a no-cost, multi-platform, production-ready distribution of the Open Java Development Kit (OpenJDK).

Amazon goes deeper with serverless
Amazon is bolstering its serverless solution AWS Lambda with new serverless development features. The company announced Lambda Layers and Lambda Runtime API at its re:Invent conference. Lambda Layers is a new feature for managing code and data across multiple functions. According to Amazon, it’s very common to have code shared across functions, whether it is custom code used by more than one function or a standard library. “Previously, you would have to package and deploy this shared code together with all the functions using it. Now, you can put common components in a ZIP file and upload it as a Lambda Layer. Your function code doesn’t need to be changed and can reference the libraries in the layer as it would normally do,” Danilo Poccia, evangelist at AWS, wrote in a post. Layers can be used to enforce separation, make function code smaller, and speed up deployments, Poccia explained. To provide an example of how to use layers, Amazon will be publishing public layers for NumPy and SciPy. In addition, there are layers available for application monitoring, security and management from AWS partners like Datadog, NodeSource, Protego, PureSec, Twistlock and Stackery. Lambda Runtime API is an interface that allows users to use any programming language or language version to develop functions. As part of the announcement, AWS is making C++ and Rust open-source runtimes available. It is also working to provide more open-source runtimes such as Erlang, Elixir, Cobol, N|Solid and PHP.
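For Python functions, Lambda unpacks a layer’s `python/` directory onto the function’s import path under `/opt/python`, so shared code only needs the right prefix inside the ZIP. Below is a minimal sketch of that packaging step; the `build_layer_zip` helper and the module contents are illustrative, and the actual upload would go through the Lambda console, CLI or an SDK.

```python
import io
import zipfile

def build_layer_zip(modules):
    """Bundle shared modules under the 'python/' prefix that Lambda
    unpacks onto the function's import path (/opt/python)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for filename, source in modules.items():
            zf.writestr("python/" + filename, source)
    return buf.getvalue()

# A helper shared by several functions, packaged once as a layer
# instead of being bundled into every deployment package.
layer_zip = build_layer_zip({
    "shared_utils.py": "def greet(name):\n    return 'hello ' + name\n",
})
```

Functions that attach the resulting layer could then `import shared_utils` directly, with no change to their own deployment packages.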

“Java is one of the most popular languages in use by AWS customers, and we are committed to supporting Java and keeping it free,” Arun Gupta, principal open-source technologist at Amazon, wrote in a blog post. “Many of our customers have become concerned that they would have to pay for a long-term supported version of Java to run their workloads. As a first step, we recently re-affirmed long-term support for Java in Amazon Linux. However, our customers and the broader Java community run Java on a variety of platforms, both on and off of AWS.”

Amazon Corretto will be available with long-term support and Amazon will continue to make performance enhancements and security fixes to it, the company explained. Amazon plans on making quarterly updates to the solution with bug fixes and patches as well as any urgent fixes necessary outside of its schedule.

Google’s mobile dev toolkit now stable
Google announced the launch of Flutter 1.0 during its Flutter Live event in London last

month. This is the first stable release of Google’s cross-platform mobile development toolkit. It features improved pixel-perfect application support and new widgets for iOS, smaller application sizes, improved app performance and previews of the Add to App and platform views features. The free and open-source platform’s major focus is on UI and creating “beautiful” applications, Tim Sneath, group product manager for Flutter, wrote in a blog post. This can come at the cost of performance, but Sneath explained that Flutter’s use of native codebases for ARM on both


iOS and Android and hardware acceleration strikes a good balance. Though Sneath said the focus of the 1.0 release was bug fixes, the company included previews of upcoming major features, expected to launch with the next quarterly release of Flutter. These features include Add to App, a feature designed to make adding new Flutter-based functionality to existing apps or migrating apps to Flutter easier; and platform views, which serves a somewhat opposite role by bringing system functions from iOS or Android into the Flutter application in their own frame or view.

Sauce Labs introduces headless browser testing
Sauce Labs is looking to speed up software development with the release of a new testing solution. The company announced Sauce Headless, a cloud-based headless testing solution. According to Sauce Labs, headless browsers are becoming more popular as an option for testing web-based apps. “A headless browser is a type of software that can access webpages but does not show them to the user and can pipe the content of the webpages to another program. Unlike a normal browser, nothing will appear on the screen when you start up a headless browser, since the programs run at the back end,” the company wrote in a blog post. Like normal browsers, headless browsers are able to parse and interpret webpages, the company explained. They provide real-browser context without

memory and speed compromises. Sauce Headless is designed to give developers access to a lightweight cloud-based infrastructure, which will be useful when they run into high test volumes early in their development cycle, according to Sauce Labs. The solution is available for Chrome and Firefox browsers in a container-based infrastructure.

Sencha launches free Ext JS Community Edition
Sencha is trying to increase JavaScript access for hobbyist developers, startups, and students by releasing its Ext JS Community Edition. Ext JS Community Edition is a free edition of Sencha’s JavaScript tooling, components, and framework targeted at those groups. According to the company, Ext JS Community Edition includes a powerful UI component library that will enable developers to build and maintain cross-platform web and mobile applications. Sencha is making this solution available to individuals and organizations that have under $10,000 in revenue or fewer than five developers. Tools in Ext JS Community Edition include:
• Pre-integrated and tested UI components
• Customizable themes
• Robust data package
• npm packaging and open tooling support
• Sample apps and stencils

Go language heads to Go 2
The first version of the programming language Go was

released more than six years ago on March 28, 2012. While the team has made many updates to the language since then, they have yet to declare a version 2.0. As the team begins to figure out what the future of Go looks like, it has been informally calling this future Go 2. However, instead of being a major release, they say it will arrive in incremental steps. “A major difference between Go 1 and Go 2 is who is going to influence the design and how decisions are made. Go 1 was a small team effort with modest outside influence; Go 2 will be much more community-driven. After almost 10 years of exposure, we have learned a lot about the language and libraries that we didn’t know in the beginning, and that was only possible through feedback from the Go community,” Robert Griesemer, a developer of the Go programming language, wrote in a post. The Go 2 roadmap includes a new proposal evaluation process that will happen in stages such as proposal selection, proposal feedback, implementation, implementation feedback and launch decision.

Microsoft announces new developer tools
Microsoft wants to give developers the ability to do more with new tools and features announced at its Microsoft Connect(); 2018 event. New developer tools announced at the conference included the general availability of Azure Machine Learning and new updates for Visual Studio 2019 preview and Visual Studio 2019 Mac preview. Azure Machine Learning is Microsoft’s tool for developers and data scientists to build, train and

deploy machine learning models. For Visual Studio 2019, the solution features improvements to IntelliCode for AI-assisted IntelliSense, refactoring capabilities, debugging features, and new GitHub pull requests capabilities. Microsoft also announced .NET Core 3 Preview is now available for better performance and the ability to use Universal Windows Platform controls in Windows Forms and WPF apps, Guthrie explained.

Angular Console provides simpler UI for Angular CLI
A new developer tool is being introduced into the Angular ecosystem to provide a simpler UI for the Angular CLI, the command line interface for the open-source front-end web application platform. Angular Console has been under development for the last three months by Angular development consultation company Narwhal and the Angular team. It was designed to provide utility for Angular developers of every skill level and adheres to the Angular CLI core concepts, Daniel Muller, lead designer and front-end architect on the project, explained in a blog post. The core concepts of the solution include workspaces, “which can be thought of as being analogous to an individual git repository;” projects, “which represent a reusable library;” and architect commands, which “define the manner in which a project should be built, served, linted or tested,” Muller wrote. The console provides UI elements that help developers more quickly and plainly interact with these command line features.



BY DAVID RUBINSTEIN

Development teams thrive on precision. They need specific requirements to write clean, tight code. They need to write tests that will accurately reveal any flaws in that code. So when development managers are told they need to consider value when developing software, the reaction is, “Well, what do you mean by value?”
In 2019, businesses will more regularly look in on their IT processes to determine if what they’re doing drives value for the business. It’s not enough to simply create a great application. If people aren’t using it, or are getting to a certain point and then bailing, that’s not delivering value. But how do organizations measure if the work they’re doing adds value?
Enter the value stream, which those organizations can create to understand how the processes they have in place and the software they create delivers — or fails to deliver — business value. Because of the complexity of modern application development, visibility into the entire process and the entire system is critical. Agile and DevOps enable organizations to deliver and deploy software faster than ever, and putting metrics on top of that begins to answer the question of whether or not each step or each software iteration is delivering value.
But the term ‘value stream’ is kind of

a new, nebulous concept that organizations are trying to get their heads around. In a nutshell, the concept is to measure and evaluate all of the end-to-end activities involved in bringing software to market, and getting the most value you can from that. This requires a change in thinking as much as a change in process. Most companies today are not thinking in terms of a value stream. They take a local approach to their business: Let’s do Agile, or let’s do DevOps. What this does is make segments of the value stream go faster, but doesn’t look at the overall process and result. This approach could lead organizations to apply resources to the wrong place, and




not to whatever their bottleneck is.

“The whole goal is to stop us thinking of silos,” said Mik Kersten, CEO of Tasktop and author of the book “Project to Product,” on managing the flow of work from a business perspective. “You’re looking at value from user story open to user story close, to code commit. And DevOps takes it further, to deploy. We need to think like manufacturers do, from customer request. What they’re pulling and how long it takes to get that to market.”

Plutora’s Jeff Keyes, director of product marketing, defined value stream this way: “Value stream mapping is an exercise of looking at software delivery not from a deployment pipeline, the Gary Gruver style, but from a ‘how do I get software from the point of ideation’ — looking at it from the customer perspective. The way that I would define value stream is it’s purely looking at the exercise of construction of that product, which is software and software service, for the customer from the customer perspective. And the point of value stream management is managing your software delivery from that point of view. Success isn’t the fact that you build code faster, or have X coverage. Success is the fact that you had this idea and now it’s in the customers’ hands, and there’s tools out there to see how the customer is using it and you can incorporate it into your [development cycle].”

Kersten added, “If you’re doing a transformation, and you want your mobile apps to get out faster, you might have the knee-jerk reaction many organizations have had, ‘let’s hire more developers, let’s get agile going, let’s get CI/CD going.’ I can’t tell you how many organizations I’ve seen do that, and then at the tail end, they realize that their bottleneck was not the developers, or their ability to deploy, but the fact that their ratio of UX designers to developers was 1 to 100, or 1 to 200. And of course they didn’t realize that until too late, because they were looking at optimizing a part, such as a deployment pipeline, or how they do Agile, rather than the end-to-end value stream.”

Companies will need to take a customer-centric view of what they’re delivering, while using Lean manufacturing techniques to eliminate waste from their processes.

One of the key ways to increase value in what you deliver is by eliminating waste. Lean manufacturing operations already use value streams, and it’s now making its way into IT.

“When you’re doing value stream mapping, you look at two things: value-added work, and non-value-added work,” according to Jim Azar, CTO at Orasi, which owns the integration platform provider ConnectALL. “And the goal is to use Lean principles around waste to remove non-value-added work. Value stream integration comes into play because copying information from one software tool to help you develop into another is non-value-added work. But there’s other non-value-added work inside those processes as well. Some of the bigger companies have been putting in habits and methods that aren’t relevant anymore. If they can remove those, they can make time to market and hit their needs as well.”

Flint Brenton, CEO of CollabNet VersionOne, said organizations need to tie their development platform to the business to derive value. “You have to give visibility into the process from development to deployment, and create a value stream model with both the developer point of view and the business point of view,” he said. “There’s no other way to benefit from enterprise value stream management than by tapping into the full promise of scaling Agile and DevOps orchestration. Value stream enables that due to its unique way of bringing together planning, Git version control and DevOps to close the gaps and remove barriers to smooth value streams that span the enterprise.”

To effectively connect work to the business, according to Tasktop’s Kersten, “the key thing that needs to happen is to switch to a product-oriented paradigm” instead of viewing development work as a project. This, of course, requires a not-insignificant culture shift and its inherent pain. So Anders Wallgren, CTO at Electric Cloud, said, “I would tell people not to change until the pain of not changing becomes greater than the pain of change.”
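The split between value-added and non-value-added work that value stream mapping surfaces is often summarized as flow efficiency: the share of total lead time spent actually adding value. A minimal sketch, with made-up activities and durations:

```python
# Flow efficiency = value-added time / total lead time.
# The activities and hour figures below are purely illustrative.
activities = [
    ("write code",        "value-added",     16.0),  # hours
    ("wait for approval", "non-value-added", 80.0),
    ("manual retyping",   "non-value-added",  4.0),
    ("test and deploy",   "value-added",      8.0),
]

def flow_efficiency(steps):
    total = sum(hours for _, _, hours in steps)
    value_added = sum(hours for _, kind, hours in steps if kind == "value-added")
    return value_added / total

print(f"flow efficiency: {flow_efficiency(activities):.0%}")  # flow efficiency: 22%
```

Even in this toy example, the long approval wait dominates the lead time, which is exactly the kind of bottleneck the practitioners quoted here say local Agile or CI/CD optimizations miss.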

What should you measure?
Once an organization has decided the pain of not changing is too great, and they want to create value streams, one of the first key things to do is to collect data. The first part of that, according to Plutora’s chief marketing officer Bob Davis, involves release planning, coordination and orchestration. “When you’re looking at an enterprise release, you’re having to mix together multiple of these different project teams together to end up with some scope set of features and functionalities across these teams, which — we’d love for them to be all independent with no dependencies, but that ain’t life,” Davis said. Coordinating the activities and workflow around taking releases forward is part of it. Managing test environments is another. According to Davis, only 40 percent of pre-production test environments are in the cloud, and even fewer have any level of automation. This makes it difficult for organizations to move away from the spreadsheets they use to track what’s going on.




“DevOps has the standard metrics you’d look for: deployment frequency and failure frequency, you have TTR (time to remediation) plus deployment frequency,” Davis added. “Value stream has additional metrics around cycle time, lead time with the focus of looking at waste.” But tooling has often stood in the way of these transformational efforts, as each department wants to use the tools they always have. This, though, makes it difficult to gain visibility into the overall work effort, unless you can integrate and orchestrate the information coming in from those disparate tools. Kersten said, “There’s a ton of functional specialization in our organizations, and we need to, because business analysts, graphic designers, marketing, developers, infrastructure admins, those are different functions, and they have different tools. I’ve learned it takes a village to build software. We have to recognize that there are all these specializations, and that’s OK, that’s great. In medicine, you used to go, a hundred years ago, to one doctor, and he would attach leeches to you and help heal you that way. But now you go to all these specialists, because the discipline of medicine has become much more complex. The problem with that is that the handoffs between specializations become the bottlenecks. In the U.S., for example, the medical errors that come from handoffs are one of the leading causes of death. When things are siloed and you can’t transfer information in an automated way, those handoffs become the bottleneck.” So, not only do you need metrics to have an effective value stream, you need to have visibility into the development and delivery processes. Plutora’s Keyes said, “Visibility is the key, because without visibility you can’t do a few things — you don’t have any idea of your pipeline, and managing your pipeline becomes very, very fragmented, especially as you go to autonomous teams and to selfdirected, and as that culture shifts, it’s a nightmare. 
Continuous improvement is impossible. I was working on our overview deck, and took a quote from

Peter Drucker, ‘you can’t manage what you can’t see,’ and I took out manage and say ‘you can’t continuously improve what you can’t measure,’ which is the point.”
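Once deployment data is collected in one place, the metrics Davis mentions — deployment frequency, failure frequency and lead time — fall out of a simple deployment log. A minimal sketch over illustrative records (the field names and values here are made up):

```python
from datetime import datetime, timedelta

# Illustrative deployment log; real data would come from the
# organization's CI/CD and issue-tracking tools.
deployments = [
    {"at": datetime(2019, 1, 7),  "failed": False, "lead_time": timedelta(days=2)},
    {"at": datetime(2019, 1, 14), "failed": True,  "lead_time": timedelta(days=5)},
    {"at": datetime(2019, 1, 21), "failed": False, "lead_time": timedelta(days=3)},
    {"at": datetime(2019, 1, 28), "failed": False, "lead_time": timedelta(days=1)},
]

def change_failure_rate(log):
    """Fraction of deployments that failed."""
    return sum(d["failed"] for d in log) / len(log)

def avg_lead_time(log):
    """Mean time from idea (or commit) to deployment."""
    return sum((d["lead_time"] for d in log), timedelta()) / len(log)

def deploys_per_week(log):
    """Deployment frequency over the span of the log."""
    span_days = (max(d["at"] for d in log) - min(d["at"] for d in log)).days or 1
    return len(log) / (span_days / 7)

print(change_failure_rate(deployments))  # 0.25
print(avg_lead_time(deployments))        # 2 days, 18:00:00
```

Tracking these numbers over time, rather than as one-off snapshots, is what turns them into the continuous-improvement signal Keyes describes.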

You have to work at it
Designing the value stream is only half the battle, though. To eliminate bottlenecks and waste, there must be an ongoing process of evaluating all you do, to see where more value can be squeezed from the work. ConnectALL’s SVP and GM Lance Knight said even after organizations have created their value stream maps and design, there’s much more work to be done. “I

know what my value stream is, I’ve done my measurements, and my time, and I know what my unplanned waste is, but nobody’s saying, OK, now what do I do with that? There’s a lot more to it,” he said. “If it takes me four weeks to get something through an approval process in my company, there’s a problem with that. There’s a lot of unneeded or wasted time in there. Now I need to dig in. When you dig in, you can use what they call a SIPOC (supplier, input, process, output, customer). So what’s coming in, what are all these things coming in, and why do I need all these to run that process? You can go deeper into value stream and Six Sigma stuff. There’s not like this one place you can go and say, here’s how you remove waste from the organization.”

Another part of the waste removal process is that organizations that have been in business a long time have had practices — habits — in place for a long time, so change is hard. Not enough questioning of process and how work is done is occurring. Knight described his time in IT at a 100-year-old aerospace company in upstate New York called Utica Drop Forge. “They were doing things that they had to do for ISO-9000, called a corrective action,” he recalled. “They were living that corrective action for 30 years. So I would go out to this injection die machine, and the guy would be sitting there and he would be writing the numbers on paper, then sitting down and putting them into a computer. “I was like, ‘Why are you writing it on paper still?’ Well, the guy said, ‘20 years ago, Pratt & Whitney filed a corrective action because they weren’t getting this information and I have to write it on paper.’ “Well, how are they getting this information now?” Knight asked the man. “I email them in a spreadsheet,” was the reply. “So, why are you writing it down?” Knight called this over-processing waste. “That’s what happens in these big organizations,” Knight explained. “They can’t get lean because they can’t figure out how to remove their own creative processes.” He went on to say that organizations have to know that that is what they’re looking for, but it’s not something that someone in the organization can often do. The solution is to bring in an outside change agent. Plutora’s Davis said, “People focus a lot, when they talk about value stream, on the value delivered at the end of the day to the customer, and tying that to the business. And while that’s obviously critical, and perhaps even where the whole thing needs to start, we see the notion of value being relevant throughout the entire organization. 
So, even at a practitioner level, there’s value in understanding better what’s going on in and around my adjacent world, to assess whether I’m getting better, and adding value to the process.”



Jim Barkdoll, CEO of TITUS
Regulations, fines and consequences…but good data stewardship is on the horizon. Now that GDPR is being fully enforced, we will see the first big fines and consequences announced related to regulations. This will result in two things — a significant bump in organizations developing best practices to ensure regulatory compliance and data safety, as well as more countries and individual states passing regulations in an effort to keep data safe.

Abby Kearns, executive director at Cloud Foundry Foundation
2019 will be a year of consolidation in the cloud ecosystem, from startups to publicly traded companies. The recent enormous acquisitions we’ve seen across the industry — such as IBM and Red Hat, Microsoft and GitHub, and most recently VMware and Heptio — are the latest confirmation of one thing: open source continues to be the driving force behind innovation for companies of all sizes. In 2019, we’ll see a deeper investment in open source technologies to power innovation, new entrants, and more exits across the entire cloud ecosystem.

Predictions for 2019

Nils Puhlmann, chief trust and security officer at Twilio
The conversation around data breaches will need to shift from “who had a breach” to “who is managing their risks effectively.” Every company has vulnerabilities and is at risk of a data breach. The public perception of companies who have a breach will need to shift so that we are evaluating companies not based on whether or not a breach is possible or has occurred but instead, to what degree, and what processes they have in place to understand and mitigate their risks and how much effort they have spent to reduce their risks.

Tomer Shiran, co-founder and CEO of Dremio
Data-as-a-Service is the next evolution in analytics. We are now 10 years into the AWS era, which began with on-demand infrastructure billed by the hour, and has now moved up through the entire stack to include full applications and every building block in between. Now companies want the same kind of “on-demand” experience for their data, provisioned for the specific needs of an individual user, instantly, with great performance, ease of use, compatibility with their favorite tools, and without waiting months for IT. Using open source projects, open standards, and cloud services, companies will deliver their first iterations of Data-as-a-Service to data consumers across critical lines of business.

Murli Thirumale, co-founder and CEO of Portworx
AI and automation will change the economics of IT. Much of DevOps is still driven by people, even if the infrastructure itself is becoming programmable. But data volumes are growing so fast and applications evolving so quickly, the infrastructure must be nimble enough that it doesn’t become the bottleneck. We’ve already replaced many storage and network admins. In 2019, infrastructure will become increasingly programmable, and AI-based machines will predict storage and compute needs and allocate resources automatically based on network conditions, workloads and historical patterns.

Simon Peel, chief strategy officer at Jitterbit
We are witnessing an industry shift in which separate markets for API management solutions and integration solutions are converging — and this is setting off a string of acquisitions on both sides of the fence. I believe we will see more API management vendors and pure-play integration vendors merge or get acquired — or at the very least create partnerships that ensure all their customers have access to API management solutions and integration solutions that are somehow combined together.

Monte Zweben, CEO of Splice Machine
Hadoop adoption dwindles and growth of existing Hadoop clusters grinds to a halt. Data lakes have largely failed. At a recent CDO conference, 90% of the CDOs in one session reported they had implemented a data lake, but only 40% said they were getting value from it. The reason is that Hadoop is really hard to operationalize, and most first attempts just dumped data onto a lake and did not curate it or think about how to use it operationally. As a result, people return to SQL. The success we see, plus that of others such as AWS and Snowflake, demonstrates this return to industry-standard SQL.


Phil Odence, general manager of Black Duck On-Demand at Synopsys
In 2019, companies will start to become sensitive to their developers’ use of calls out to third-party APIs. It’s a blind spot in the vast majority of IT organizations, similar to the way that open source was ten years ago. Most companies understand the importance of ensuring that the APIs they publish are secure from outside attack, but few are even tracking their own code’s use of web services via calls to third-party APIs from the inside out. Although there are other legal and business risks that come with reliance on third-party services, the visibility will likely arise from companies having to account for confidential data they are inadvertently passing to unknown and untrusted sources outside their firewalls.

Laurent Bride, CTO of Talend
Serverless will move beyond the hype as developers take hold: 2018 was all about understanding what serverless is, but as more developers learn the benefits and begin testing in serverless environments, more tools will be created to allow them to take full advantage of the architecture and to leverage functions-as-a-service. Serverless will create new application ecosystems where startups can thrive off the low-cost architecture and creatively solve deployment challenges.

Mark Troester, VP of strategy at Progress
While traditional low-code vendors will attempt to re-architect their monolithic platforms, they will be disrupted by high-productivity platforms that are designed to be cloud-native with support for serverless and microservices. Combining serverless and low-code will allow organizations that are under extreme pressure to meet the demands of consumer-grade multi-channel experiences with existing JavaScript developers.

Carlos Meléndez, COO of Wovenware BI is officially declared dead. The term Business Intelligence has been around since 1958 (long before it could be traced as a search term) and it is showing its age. In 2019 the term will finally give way to Business Insights, marked by less of a focus on dashboards and reports and more of a focus on outcome-driven analytics, measuring analytics according to outcomes.

Mark Curphey, VP of strategy at Veracode In today's DevOps-centric organizations, we'll see a greater shift to automatically testing every code change so that development teams can find and fix flaws early in the software development lifecycle, which saves significant time for both developers and security personnel. This is fundamental to DevSecOps, which I believe will become the dominant model for development teams. The most active DevSecOps programs fix flaws more than 11.5 times faster than the typical organization, due to ongoing security checks during continuous delivery of software builds, largely the result of increased code scanning. This means organizations adopting DevSecOps are operating both more efficiently and more securely.

Mark Levy, director of strategy at Micro Focus In 2019, AI and machine learning (ML) will converge with automation and revolutionize DevOps. As DevOps practices focus on increasing operational efficiency, this upcoming convergence of ML, AI and automation will present a significant advantage for companies utilizing DevOps. New and adaptive automation systems will give companies a competitive edge, as teams following DevOps practices will be able to make real-time decisions based on real-time feedback.

Derek Choy, CIO of Rainforest QA All job interviews for developers will include a question about security experience: Developers who have job interviews next year will see a new question added to the usual list. For the first time, we will see virtually all businesses incorporating questions about security in coding. Engineers applying for jobs should highlight this experience to increase their chances. Deep security knowledge won't be a requirement for all roles, but DevOps managers will increasingly prioritize those with security experience when they make their hiring decisions. The issue is simply too critical to ignore.

Chris Huff, chief strategy officer at Kofax RPA products will continue to focus on unstructured data: Vendors will align with the macro shift from data collection to data analysis, which will drive a focus on Cognitive Capture, NLP and AI algorithms to transform unstructured data into structured data that can be scored and delivered back to the enterprise to support improved decision-making. Companies [will] begin to create formal RPA roles: RPA at scale requires new roles such as Bot Trainer, Bot Developer and RPA Manager, largely filled by existing employees who have been upskilled. Companies will also begin to market their automation programs in recruiting efforts to attract talent.





2018: Kubernetes solutions take over


This year was a big year for containers, and in particular Kubernetes, with several of the major cloud providers offering new Kubernetes solutions. Kubernetes graduated from the Cloud Native Computing Foundation in March. Kubernetes was the CNCF's first project and was the first open-source project to graduate as well. Kubernetes 1.10 was released at the end of that month, featuring a beta of the Container Storage Interface (CSI). In August, Google announced that it was investing $9 million in Kubernetes development, to be spent over

the next three years to cover the infrastructure costs of developing Kubernetes, such as running CI/CD pipelines and providing the container image download repository. Google announced in May that the container runtime containerd was generally available. Containerd was developed by Docker and then donated to the CNCF. At Google Cloud Next 2018 in July, Google revealed GKE (Google Kubernetes Engine) On-Prem. The solution enables users to modernize existing apps without having to move to the cloud. Microsoft also released a Kubernetes solution: Azure Kubernetes Service (AKS). AKS enables teams to deploy and operate Kubernetes. The company also released Azure Container Instances (ACI) for Linux and Windows containers. Amazon also joined in with its own solution for AWS: Amazon Elastic Container Service for Kubernetes (Amazon EKS). Early in the year, Pivotal and VMware announced the general availability of their Pivotal Container Service (PKS), which enables operators to run Kubernetes at scale. Atlassian released an open-source Kubernetes autoscaler called Escalator. Escalator solves many of the issues associated with autoscaling, such as clusters not scaling up or down fast enough. In July, Istio 1.0 was launched. The open-source project was initially formed by Google, IBM, and Lyft as a way to reliably connect, manage, secure, and monitor Kubernetes-based clusters of microservices. Istio 1.0 defined a way of managing traffic between microservices, provided standard enforcement of access policies, and provided a uniform method for aggregating telemetry data for monitoring and management. In May, Red Hat entered into several

Ethics at the fore of AI conversations
BY IAN C. SCHAFER

As the end of 2018 approached, many artificial intelligence technologies like visual testing, chatbots and language recognition had matured to the point of ubiquity. Two years since SD Times' "Year of AI," the conversations around AI and machine learning have shifted further and further away from potential applications and surprising new uses of the tech. Now the topic on everyone's mind, from developers and analysts to the layman unsure about how much he trusts self-driving cars, is the ethics of AI: where it is ethical to apply AI, whether the biases of these systems' creators can spoil their decision-making capabilities, and whether the people displaced from their careers by automation will have an alternative.

In June, after internal protest led it to pull out of a Pentagon-commissioned military AI contract, Google laid out specific principles for what it considers ethical AI development, and others have followed suit. In October, MIT pledged $1 billion towards advancing AI by bringing computing and AI to all fields of study, hiring personnel, and creating educational and research opportunities in the field. In the announcement, Stephen A. Schwarzman, CEO of Blackstone and one of the backers of MIT's initiative, said, "We face fundamental questions about how to ensure that technological advancements benefit all — especially those most vulnerable to the radical changes AI will inevitably bring to the nature of the workforce."

Google is addressing this with a $50 million funding initiative, launched in July through its nonprofit arm, for organizations preparing for that scenario. This includes training, education and networking to help people gain the skills that Google says will be required in a future workforce, and which it says aren't as common as they need to be, as well as support for workers in low-paying roles that might be made obsolete. When DARPA announced in August that it would be investing in exploring the "third wave" of artificial intelligence research, the agency said that the focus would be on making AI more able to contextualize details and make inferences from far fewer data points by recognizing how its own learning model is structured. The example DARPA gave


2018: THE YEAR IN REVIEW

partnerships aimed at accelerating adoption of microservices and containers. The company partnered with Microsoft to provide a jointly managed OpenShift offering, empowering developers to run container-based apps in Azure and on-premises. Red Hat also announced plans to support NGINX on Red Hat Enterprise Linux and Red Hat OpenShift Container Platform. Earlier in the year, Red Hat had also acquired Kubernetes and container solution provider CoreOS. After the acquisition, CoreOS open sourced the Operator Framework, which is a toolkit for managing Kubernetes applications "in a more effective, automated, and scalable way." The next month, it released Operator Metering in order to offer a way of gaining insights on the usage and costs of running Operators. The Open Container Initiative (OCI) also released the Distribution Specification project. The project was created to standardize container image distribution. "With the booming development in container and cloud native technologies, the community needs a reliable industry standard for distribution to allow for increased interoperability along with a neutral home to evolve the specification," said Chris Aniszczyk, executive director of OCI. z

was in the image recognition of a cat: instead of relying on thousands upon thousands of images of cats to pick out another cat, as in the training-data- and example-focused "second wave" of AI, an AI would be able to pick out the cat by noting that the image had fur, whiskers, claws, etc. The MIT-IBM Watson AI Lab announced similar projects in development back in April that focused on training AI to recognize dynamic action in video. The end goal was to train an AI to build analogies and interpret actions and dynamics. Gartner placed a focus on digital ethics and privacy at number nine on its list of predictions for 2019, and as the industry moves towards this new wave, the ethical ramifications of emerging technologies are predicted to be considered earlier and earlier. "Shifting from privacy to ethics moves the conversation beyond 'are we compliant' toward 'are we doing the right thing,'" Gartner wrote. z


Java changes things up in 2018
BY JENNA SARGENT

This year was a big year for Java because of the changes to the language's release schedule and the transfer of Java EE to the Eclipse Foundation. Last year, Oracle announced that it would be releasing major versions of Java twice per year, and Java 10 was the first release in that new schedule. Java 10 was released in March and included features such as extending type inference to local variable declarations, GC parallelization, optimized startup time, and the ability to use Graal as an experimental JIT compiler on Linux/x64. Java 11 was released in September and was a Long Term Support (LTS) release, which means that it will be supported by Oracle via security and bug-fixing updates until at least 2026. Sometime between the release of Java 10 and Java 11, the JVM Ecosystem Survey Report revealed that Java 8 was still the most widely used version of Java. The report found that 79 percent of developers use Java 8, 4 percent use Java 9, and 4 percent use Java 10. In February, Java EE was renamed to Jakarta EE after being moved to the Eclipse Foundation from Oracle. The name is a reference to the Jakarta Project, which was an early Apache open-source project. Other renamed Java projects include GlassFish, which is now Eclipse GlassFish; the Java Community Process, which is now the Eclipse Working Group; and Oracle development management, which is now the Eclipse Enterprise for Java Project Management Committee. In March, Oracle split off JavaFX into its own module. It was previously part of the JDK, and will continue to be supported as part of JDK 8 until at least 2022, but starting with Java 11 it was available as its own module. Oracle revealed that it would work with third parties to make it easier to maintain JavaFX as an open-source module. Other cuts made by Oracle include removing support for Applets in 2019 and removing Java Web Start starting with Java 11.
According to the company, Java Web Start will be supported in Java 8 until 2025, and products with Web Start dependencies will be supported on a to-be-determined timeline. In June, the Eclipse Foundation released the latest version of the Eclipse IDE. Eclipse Photon expanded on polyglot capabilities. New features include C# editing and debugging capabilities, support for Java 10 and Java EE 8, dark theme improvements, and support for building, debugging, running, and packaging Rust apps. The next month, Google released Jib, a tool that Java developers can use to containerize applications. The reasoning behind creating Jib was that Java developers are often not container experts, making it difficult to containerize their apps. Amazon also released a no-cost distribution of OpenJDK in an effort to make sure that Java is available for free to its users in the long term. Amazon Corretto is available with long-term support, and Amazon will continue making performance enhancements and security fixes. z
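Java 10's local-variable type inference, mentioned above, is easy to see in a short sketch. This is an illustrative example, not taken from the article; the class and variable names are invented for demonstration:

```java
// Demonstrates Java 10's 'var': the compiler infers the local variable's
// type from its initializer, so the explicit type can be omitted.
import java.util.ArrayList;

public class VarDemo {
    public static void main(String[] args) {
        // Inferred as ArrayList<String>; 'var' is only legal for locals
        // with initializers, not for fields, parameters, or return types.
        var names = new ArrayList<String>();
        names.add("duke");

        // 'var' also works in enhanced for-loops.
        for (var name : names) {
            System.out.println(name.toUpperCase()); // prints "DUKE"
        }
    }
}
```

The feature changes only the declaration syntax; the variable is still statically typed, so `names.add(42)` would fail to compile.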





Microsoft takes Azure to the next level
BY CHRISTINA CARDOZA

Microsoft first developed its cloud computing service Azure with a mission to let developers and organizations do more in the cloud. Over the last year, the company has brought the service beyond just the cloud to DevOps, blockchain technologies, and the open-source community. In March, Microsoft decided to release its Azure Service Fabric to the open-source community. The distributed systems platform was designed to package, deploy, and manage microservices and containers. The move to open source was to enable others to participate in the development and direction of the platform. Microsoft also announced big plans for Azure and the Internet of Things with a $5 billion investment in IoT revealed in April. As part of the investment announcement, the company made a number of new portfolio updates, such as providing a basic, "device to cloud telemetry" tier to its Azure IoT Hub solution. In September, the company held its annual Ignite conference around three primary themes: IT security, AI and data, and IoT and edge computing. At the conference, the company announced Azure confidential computing to protect data in the cloud, new capabilities for Azure Cognitive Services, Azure Digital Twins, and general availability of Azure IoT Central. Azure came to blockchain technology in November with the release of the Azure Blockchain Development Kit, focused on connecting, integrating and deploying smart contracts and blockchain networks. Some key features of the kit include key management, off-chain identity and data, monitoring, and messaging APIs. Microsoft ended its 2018 Azure journey with the release of Azure DevOps, a set of DevOps tools for collaboration and delivering high-quality software faster. Azure DevOps was first announced at the end of 2017, and it took Microsoft almost a year to bring the product full circle. It was introduced in September of 2018 with Azure

The conversation about Agile and
BY CHRISTINA CARDOZA

By now, Agile and DevOps are proven methodologies and well known throughout the software development industry, but their stories are not over yet. Although the idea of Agile has been around for over a decade now with DevOps following closely behind it, there are plenty of organizations still looking to expand and improve their implementations. Forrester ended 2017 with a prediction that 2018 would be the year of enterprise DevOps. A couple months later, in February of 2018, XebiaLabs announced a $100 million round of funding to enter into a new era of enterprise DevOps. Towards the end of the year, XebiaLabs announced enterprise DevOps for the cloud at the DevOps Enterprise Summit in London to help organizations move apps and services into the cloud era. CloudBees followed suit with a new

investment in Kubernetes technology for enterprise DevOps. The investment provided full support for Kubernetes in CloudBees Jenkins Enterprise. GitLab declared in March that 2018 would be a big year for DevOps, with its 2018 Global Developer Survey revealing that demand for DevOps continues to grow. The company would go on later that year to raise $100 million for its

DevOps vision. The company’s Concurrent DevOps vision aims to break down barriers, build features for each DevOps stage in one app and provide the ability to manage, plan, create, verify, package, release, configure, monitor and secure software more easily. The company’s 11.0 release came with the general availability of Auto DevOps, a concept aimed at helping developers deliver their ideas to



Pipelines, Azure Boards, Azure Artifacts, Azure Repos and Azure Test Plans. The company explained Azure DevOps was an evolution of its Visual Studio Team Services. As part of the release, VSTS users were upgraded to Azure DevOps. Throughout the next year, the company will be working on updating its DevOps tools for better planning, collaboration and faster shipping. Outside of Azure, Microsoft surprised the industry with its June announcement that it was acquiring the software development platform GitHub for $7.5 billion. "Microsoft is a developer-first company, and by joining forces with GitHub we strengthen our commitment to developer freedom, openness and innovation," said Satya Nadella, CEO of Microsoft. "We recognize the community responsibility we take on with this agreement and will do our best work to empower every developer to build, innovate and solve the world's most pressing challenges." That same month, Microsoft teased the upcoming release of Visual Studio 2019. VS 2019 aims to be a faster, more reliable, more productive and easier-to-use version of Visual Studio, with features such as improved refactoring, navigation and debugging capabilities, faster solution load, and faster builds. At its Connect(); 2018 conference in December, the company announced the latest versions of the Visual Studio 2019 preview and the Visual Studio 2019 for Mac preview. The previews included improvements to IntelliCode for AI-assisted

dashboarding capabilities and object tagging for customer reporting and improved searchability. Meanwhile, in the Agile space, businesses struggled with getting back to the true meaning of Agile. According to Atlassian's head of R&D Dominic Price, somewhere along the way we lost what it meant to be Agile. "A lot of teams have been going through the motions and keeping the rituals because 'that's what we've always done.' Instead of looking at why they started doing something or how it adds value to keep using a practice or framework, the team has gone on auto-pilot," he said. The Agile 2018 conference in August aimed to change the conversation from worrying about the process to worrying about what users were getting out of the process. "Businesses say we want to build better products and if we happen to do that faster, then that's great," said

IntelliSense, refactoring capabilities, debugging features, and new GitHub pull requests capabilities. Also announced at the Connect(); 2018 conference was the general availability of Azure Machine Learning, .NET Core 3 Preview, Cloud Native Application Bundles for distributed apps, the Azure Kubernetes Service virtual node public preview for serverless Kubernetes, and Python support for Azure Functions. Additionally, the company opened its .NET Foundation to the open-source community by inviting them to take a more active role and help guide the foundation as well as build the ecosystem. "Now, more than ever, we're moving towards a world of ubiquitous computing where technology is responsible for transforming every consumer and business experience," Scott Guthrie, executive vice president for the cloud and enterprise group at Microsoft, wrote in a blog post. "For developers, the opportunity to use technologies like AI, IoT, serverless compute, containers and more has never been greater." z

DevOps is far from over production faster and accelerate the enterprise DevOps adoption. ServiceNow announced a new enterprise DevOps offering in May as part of its app development services. Enterprise DevOps was built to include both IT Ops and development teams. IT Ops wants control, transparency, security and governance while developers want agility, flexibility and speed, ServiceNow CTO Allan Leinwand explained. JFrog also turned its focus to enterprise DevOps throughout the year with the release of Enterprise++, a $165 million round of funding to accelerate universal DevOps within the enterprise, and the acquisition of DevOps consulting company Trainologic. Electric Cloud wrapped up the DevOps year with the release of ElectricFlow 6.5, designed to enable users to do “DevOps your way” with new Kanban-style pipeline views, CI dash-


Shannon Mason, VP of product for CA Agile. "But if you ask them what they ended up doing, they find they built a lot of things faster, but they don't know if they are actually building things that are better." Instead of concerning themselves with the output of Agile, businesses have started to look more at the outcome. Outputs measure what you produced, while outcomes measure the results or value of what you produced. The Agile mindset shifted from velocity to business agility, with businesses more aware of what they were building and how satisfied customers were. "If you build the right thing, you are more likely to be a successful business," said Zubin Irani, CEO of cPrime. "People are very focused on velocity and output, and they are not focused on the quality of that or the business impact. We are not doing Agile to do Agile. We are doing Agile to solve a business problem or take advantage of business opportunities." z






Security continues to be a black cloud for businesses
BY CHRISTINA CARDOZA

The year would not be complete without a major security breach, and although there are a number to choose from in any given year, Marriott ended 2018 with a doozy. The company revealed at the end of November that there had been unauthorized access to its Starwood reservation database for more than four years. This included unauthorized access to travel information, passport numbers, and credit card data — impacting up to 500 million guests who made a reservation at a Starwood property. Marriott has since been investigating the problem, taking steps to address the issue and offering its apology, but we can't help but wonder what type of safeguards could have been put in place to prevent this. Security becomes more important every year, and 2018 had a strong focus on protecting user privacy and putting proper precautions in place to prevent incidents. The Cost of a Data Breach 2018 report was released in July by Ponemon

Institute and sponsored by IBM Security, which found the average cost of a data breach is $3.86 million globally and has been increasing steadily over the last five years. To address the ongoing problem, the U.S. Securities and Exchange Commission (SEC) updated its six-year cybersecurity guidance to provide new rules for addressing and disclosing data breaches. The European Union’s General Data Protection Regulation also went into effect this year, giving businesses all across the world new perspective on protecting sensitive user data. “For a long time, businesses just collected more data than it needed to and retained it much longer than it should with the hope that someday it is going to provide some kind of value,” said Nigel Tozer, solutions marketing director, EMEA at Commvault. The GDPR was designed to better protect personal data of individuals and hold businesses more accountable to the data they acquire and how they use it. While the GDPR is a regulation

specifically in the EU, it impacts anyone who does any kind of business within the EU. IBM’s chief data officer Seth Dobrin stated he believed GDPR should be treated as a global effort and applied to more than just subjects in the EU. The same month GDPR was put into effect, IBM released its own platform for providing security and compliance capabilities to organizations with the Cloud Private for Data solution. A couple months later, the California Consumer Privacy Act was announced to address data regulation and privacy. The act is designed to give users the right to know all the data a business collects on them, the right to delete their data, the right to say no to the sale of their data and more. The act isn’t expected to go into effect until 2020. Other ways the industry tried to maintain security included the Software Assurance Forum for Excellence in Code (SAFECode) updating its guide on best secure software development practices early in the year. The forum announced the Fundamental Practices for Secure Software Development: Essential Elements of a Secure Development Life Cycle Program (Third Edition) in March with recommendations to improving software assurance programs and encourage the adoption of secure development practices. With more solutions being deployed in the cloud, the Cloud Standards Customer Council (CSCC) announced 10 steps to ensure security for cloud computing. The top steps included securing effective governance, risk and compliance; audit operational business processes; and the ability to manage people roles and identities. The year ended with a group of security executives getting together at the Infosecurity North America conference in New York City to talk about how organizations can improve the effectiveness of their security programs. 
While different organizations are going to have different strategies, the executives stressed the importance of being able to tell if a program is effective and of educating the entire organization on how not to put itself at risk. z






Instana APM Enables Total Traceability

Today's developers build complex applications designed to run in complex environments. Since the mix of proprietary, third-party and open source software is constantly changing, they need an APM solution that can adapt with their technology while providing full, end-to-end traceability. Using traditional APM solutions, configurations must be done manually, if they're possible to do at all. With Instana, developers get complete traceability throughout the entire tech stack, automatically, instead of traceability that's limited to what they were able to manually configure. "Developers spend a lot of time and money configuring traditional APM solutions, but they are still unable to see everything that's impacting application performance," said Fabian Lange, founder and VP of Engineering at Instana. "Companies don't want to pay for the high cost of manual configuration when a more comprehensive job can be done faster and more effectively using an automated APM solution."

Microservices Support Is Essential

Most APM solutions were built before the advent of microservices and serverless architectures, so their design is more rigid and limited than modern, automated APM solutions. For example, traditional APM solutions include a set of default functions that don't work well automatically out of the box. Developers must modify the rules, adjust the agents and ensure the agents operate properly in the software ecosystem. Then, when the software mix changes, all the original configuration efforts are lost. Instana never needs to be configured; therefore, it never needs to be reconfigured.

"Containerized architectures require developers to adapt their development cycles. However, traditional APM solutions don't allow them to work in an Agile fashion and they may not work in Docker," said Lange. "Instana does not limit the scope or type of the technologies or methodologies you use."

Instana captures every trace occurring throughout a system, so developers can find random outliers and errors that traditional APM solutions miss. "We capture all the data, not just some of it, so if there's a problem — anywhere — you'll see it instantly," said Lange. "For example, if you're having trouble with a subset of application users in New York City, Instana could help you determine that the root cause of the problem is a particular service provider in the area."

Since distributed microservices architectures use components and technologies that are not used by traditional enterprise systems, it's important to understand how requests are routed and whether loads are balanced correctly. Unlike traditional APM tools, Instana uses a semantic model that accommodates mature and emerging technologies and development methodologies.

Instana Is Intelligent

Comprehensive, end-to-end tracing helps users identify and resolve errors that can occur anytime in complex systems. Instana's machine learning capabilities identify recurring patterns and transaction delays caused by network latencies and distributed microservices messaging overhead.

"By tracing all the aspects including latency, errors and requests, you can better understand how the system works," said Lange. "If you're trying to do the same thing manually, you can add tracing where you know your code is executed, but if you don't know where your code is executed — perhaps you inherited software in a merger or acquisition or it was developed by a third party — you have zero visibility into it. You need complete traceability to understand what's actually happening while your program is running."

Although open source APM projects are popular, they also require a lot of work. In a Hello World application, tracing can be as simple as adding a couple of lines of code. However, for highly asynchronous applications, which are now common, such as those built with Scala or the Reactive Framework, tracing is extremely hard to do manually. In fact, in a recent study, Bernd Harzog, CEO of APM Experts, said, "open source manual monitoring is a bad idea."

"All our users really appreciate that Instana does the tracing automatically and correctly because they understand how really hard it is to do manually," said Lange. "Our comprehensive views enable them to see the points at which a request starts and ends, as well as the place where the request is passed onto something else for further processing. That's hard to get right when you're using open source tools like Jaeger."

All Instana traces are constantly monitored by Instana's virtual assistant, Stan, which applies machine learning and algorithms to all the incoming data. The capability enables developers to identify and address the root cause of errors, latencies and other performance impacts faster, cheaper and more efficiently.

Content provided by SD Times and

Learn more at z




SignalFx announces Microservices APM to speed troubleshooting for DevOps
BY CHRISTINA CARDOZA

SignalFx has announced a new solution designed to accelerate troubleshooting for DevOps teams and provide advanced real-time analytics. The SignalFx Microservices APM is an application performance monitoring solution powered by the company's distributed tracing architecture and built on top of its streaming analytics platform for metrics.

SignalFx provides real-time threat detections so DevOps teams can address and fix anomalies quickly.

According to the company, the solution is able to observe every transaction, report on every anomaly, pinpoint the most challenging issues, and identify the root cause of critical problems. The company decided to develop the APM solution because traditional APM systems lacked capabilities for microservice applications.

"Microservices have changed the game for companies, allowing them to move faster by breaking up applications into modules that can be developed and updated independently. At the same time, it is now exponentially more difficult to identify and triage problems as they span hundreds of services and evolve rapidly," said Arijit Mukherji, CTO of SignalFx.

"The world happens in real-time and if something goes wrong, finding problems minutes later just doesn't cut it," said Karthik Rau, CEO and founder of SignalFx. "Customers expect technology and applications to just work. Their tolerance for failure is next to zero with the value of their brand held in the balance."

SignalFx also explained DevOps teams can ingest metrics, traces and events from any source, achieve real-time problem detection, offer directed troubleshooting, and integrate the tool within their DevOps toolchain.

"A shift to faster, more agile software delivery is requiring the adoption of monitoring and APM tools that provide real-time alerting, dashboards, and automation for auto-remediation and auto-scaling," said Stephen Elliott, program vice president of management software and DevOps at IDC. z

OzCode makes debugging available as a service
BY CHRISTINA CARDOZA

OzCode is on a mission to reduce the time it takes to debug a service from days to hours to minutes with its latest debugging-as-a-service solution. OzCode provides debugging extensions for Visual Studio, and its latest OzCode Azure DevOps offering is available through the Microsoft Azure cloud.

"Studies show that roughly 50 percent of a software developer's time is spent debugging," said Omer Raviv, co-founder and CTO of OzCode. "Our goal is to drastically reduce the amount of time spent on debugging, and thereby improve the lives of our fellow developers."

According to the company, one of the biggest challenges when it comes to debugging is debugging in production. "A bug that is found in production costs infinitely more to fix than a bug found during development or testing. In the best-case scenario, it will simply take much longer to find and solve. In the worst case, the bug will cause material damages and loss of customers," Raviv wrote in a blog.

The solution aims to provide "pre-bugging" capabilities so developers can find bugs and fix them before they happen. While it is ideal to find bugs before they get released into production, Raviv explained the reality is that bugs are still found in production. To address this, the debugging solution features integration with other APM solutions such as Application Insights, Raygun and Sentry, so OzCode can extract a snapshot at the time of failure and display the source code in the debugging solution.

In addition, OzCode features time-travel debugging so developers can create "what if" scenarios and make live code changes in the browser without having to redeploy or reproduce the scenario, according to Raviv. Other features include automatic redaction of personally identifiable information, live coding to obtain instant verification, analysis of crash dumps, automated tests from production use-cases, and full traceability. z


Peace in a POD
Data decentralization will allow users to carry their own data, allaying privacy concerns
BY JENNA SARGENT


A few months ago, Tim Berners-Lee, the father of the World Wide Web, announced that he was heading up a new project to reshape the way we interact with the web. His new project, Solid, aims to put control over data back into the hands of users.

The ecosystem of Solid is based on the concept of PODs, which are spaces where an individual can keep all of their data, explained Justin Bingham, CTO of Janeiro Digital, a company that is working on the Solid technology. For example, a person might use a POD to store their birth certificate, medical records, and pictures. "Basically all of the things that you would do to an application today that would equate to your digital identity, you retain ownership," said Bingham.

According to Solid's website, PODs are like "secure USB sticks for the Web, that you can access from anywhere. When you give others access to parts of your POD, they can react to your photos and share their memories with you. You decide which things apps and people can see."

The concept would be particularly appealing to individuals tired of their data being tied up in so many different places while having virtually no control over it once it's out in the world. Applications would change from being usable through centralized data contained on their own servers to having to ask permission to access data within a POD on a day-to-day, transaction-by-transaction basis, he said.

In addition to changing the way we interact with web apps, the technology has some other practical applications. For example, it would change the way health records are dealt with. In our current system, a patient may have health data all over the place if they've seen many different doctors or were cared for at different hospitals. "You have all of this different information that makes the ability to give you good health services more challenging than if you walked in and had all of your data contained on a single pod," said Bingham.

This means that places like hospitals would go from being the custodians of data to an entity that has to request access to certain information based on need. "I think we can all agree that 99.999% of the time, the hospital holding your information is putting themself at risk for no reason at all," said Bingham. "It's not like you get any incremental value by that hospital storing your data versus yourself."

A new sort of application would need to be created in order to facilitate this interaction, he explained. It would have to be able to both view the information it is provided with, as well as submit information back to the user. "It'll be the application's duty to then take that information and also copy it over to my POD so my POD always has the most up-to-date information," said Bingham.
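The interaction Bingham describes, where an app must hold an explicit, revocable grant before reading anything from a POD, can be sketched in a few lines of JavaScript. This is a hypothetical illustration of the access model only; the class and method names are invented and are not the actual Solid API:

```javascript
// Hypothetical sketch of POD-style access control: the user owns the
// data store and grants or revokes per-resource permissions to apps.
// None of these names come from the real Solid libraries.
class Pod {
  constructor(owner) {
    this.owner = owner;
    this.resources = new Map(); // path -> data
    this.grants = new Map();    // appId -> Set of allowed paths
  }
  store(path, data) {
    this.resources.set(path, data);
  }
  grant(appId, path) {
    if (!this.grants.has(appId)) this.grants.set(appId, new Set());
    this.grants.get(appId).add(path);
  }
  revoke(appId, path) {
    this.grants.get(appId)?.delete(path);
  }
  // An app must hold a grant for the exact resource it requests.
  read(appId, path) {
    if (!this.grants.get(appId)?.has(path)) {
      throw new Error(`${appId} has no access to ${path}`);
    }
    return this.resources.get(path);
  }
}

const pod = new Pod("alice");
pod.store("/health/records", { bloodType: "O+" });
pod.grant("hospital-app", "/health/records");
console.log(pod.read("hospital-app", "/health/records").bloodType); // "O+"
pod.revoke("hospital-app", "/health/records");
// A further pod.read("hospital-app", ...) would now throw.
```

The point of the sketch is the inversion Bingham describes: the hospital app never holds the record, it only holds a grant the owner can withdraw at any time.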


There will have to be a shift in the way that applications interact with data. "They'll go away from a centralized data store contained within an organization to a distributed one that people who are POD owners have the ability to turn access on and off, based on how they feel," said Bingham.

While this model sounds great, getting to this point won't be as easy as it sounds. In an ideal world, we would migrate all of the data that we've ever stored anywhere, said Bingham. But realistically, this POD-based model would probably only work going forward. What that means is that from this

point forward, all of your new data would be stored in a POD. “Now, Apple’s App Store, that didn’t just appear overnight, right? So in my opinion, the stewards of this particular model will not just be individuals looking to pull down a POD and store their individual data and taking ownership back from organizations where otherwise they would feel helpless,” said Bingham. “I think that what you’ll see is a shared interest from some large organizations out there to help reinvent the way they do business today, unlocking incredible new opportunities for them to do business.”


Those early adopters are going to have to serve up the ecosystem of the future, he explained. Enterprises will need the structure to interact with PODs and data in this new way. Additionally, the security of PODs will be an entirely new industry on its own. Users will need to feel that the data they’re putting in their PODs are secure. “Our digital identity is a currency, as we all know,” said Bingham. “There are people who make a lot of money off of that information and that means that that information needs to be protected as a valuable asset.” Bingham believes that the project will need people who can dream big in order to be successful. “We need dreamers that are capable of seeing the good in something different and see opportunity and doing things in a way that aligns with Tim’s vision of how the web was originally designed,” he said. “It’s not so much just to pay homage to Tim, which is a great thing to do, but also because it just makes good business sense.” He thinks that CEOs will soon start to see and understand this new way of thinking about things. “Forget about the technology for a minute, but just think about the possibilities that a real decentralized web would bring to them and the risk that it reduces for them around being the sole custodian around certain types of information. And that’s a true digital transformation.” z

Inrupt disrupts the web

Inrupt is the company behind the Solid technology. It was co-founded by Sir Tim Berners-Lee, the creator of the World Wide Web, and John Bruce, who also co-founded and was the CEO of Resilient before it was acquired by IBM. The company was founded as a way to provide resources for Solid to grow. The Inrupt team has been working to ensure that Solid is becoming robust, feature-rich, and ready for wide-scale adoption.

"Together, Solid and inrupt will provide new experiences benefitting every web user — and that are impossible on the web today. Where individuals, developers and businesses create and find innovative, life- and business-enriching, applications and services. Where we all find trusted services for storing, securing and managing personal data," Berners-Lee wrote in a post. z

[Caption: With the Solid data browser, all your data are in one place, and you can easily link to and share with anyone you choose, and drag and drop those elements between apps.]


JavaScript has come a long way and shows no sign of slowing ince its release more than 20 years ago, JavaScript has gone through a lot. But even though it has come so far, the language continues to grow and evolve, and interest in using it is still rising. JavaScript is an incredibly approachable language, which probably has something to do with why it has become so popular among developers. Its robust community continues to add to JavaScript’s


continued on page 30 >


< continued from page 29

rich ecosystem of libraries and frameworks. Other languages have similar ecosystems, but one thing that has propelled JavaScript forward is debugging capabilities available in TypeScript and Chrome, according to Todd Anglin, vice president of product strategy and developer relations at Progress. “I think JavaScript is going to be on this journey forever, as long as people find ways to make software easier to create, more robust, and more secure,” Anglin said. “It’s just a continuum.”

Getting more involved with standardization

One positive step that should help advance JavaScript even further is that the community is getting more involved in the standardization process. "Now we're starting to be having more initiative towards making JavaScript what we want it to be, and I think ECMAScript has a lot to do with that," said Tara Manicsic, developer advocate at Progress. "And the community being more involved in wanting to make the language better instead of complaining about it is a huge step and is something that's really great."

Recently, the JavaScript Foundation announced that it intended to merge with the Node.js Foundation. According to Manicsic, this was a result of so many developers using Node, not just for JavaScript on the back end, but for the front end and new technologies from such initiatives as IoT. "Node is being incorporated more and so you're seeing more people in the community interested in the foundations merging and trying to see what that means for them as developers," said Manicsic. "I think they're starting to realize more just how important standardization is and how important it is to bring the JavaScript communities together from fractured communities."

Improvements in hardware and uses beyond web development will help JavaScript grow even more.

According to Anglin, most people never even think about evolving a language as long as they're using tools that are working for them. "I do think it takes that one percent or 0.01 percent that will see a language that's working well for a lot of people, has a great ecosystem and libraries, and still say that it can be better," he said.

Jory Burson, standards liaison at web platform consulting group Bocoup, added that an important thing to consider when evolving a language is knowing when to say no to things. "We're designing JavaScript as a product, so saying no to a lot of things is often more important than what we end up saying yes to," Burson said. "By not accepting every suggestion or proposal that gets made, JavaScript becomes a stronger project."

The JavaScript ecosystem contains several libraries that solve problems for subsets of users. Sometimes, a library's functionality may be added to the language, but there are many things that will be considered by TC39, the ECMA technical committee for JavaScript, before that can happen. According to Burson, the committee may ask questions like: "Does this solve problems for other use cases? How big of a need is this? If we put this in the language, is this going to be a problem for other contexts?"

Another issue is that there is a limited amount of syntax available, Burson added. Once you use up a certain piece of syntax, it is no longer available for another use case. One example of a library's functionality being added to JavaScript is left-pad, which was used for padding strings to a fixed width. The library was taken down in 2016, but many projects used it as a dependency, so when it was removed, it broke a lot of projects, Manicsic explained. Eventually, that functionality was added to the JavaScript standard.
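The functionality that replaced left-pad landed in the language itself as String.prototype.padStart in ES2017, alongside its mirror padEnd. A minimal illustration of the standardized behavior:

```javascript
// String.prototype.padStart/padEnd (ES2017) cover the core use case
// the left-pad package served: padding a string to a fixed width.
const id = "42";
console.log(id.padStart(5, "0")); // "00042"
console.log(id.padEnd(5, "."));   // "42..."

// Typical use: right-aligning columns of numbers in plain-text output.
const prices = [3.5, 12, 107.25];
const lines = prices.map(p => p.toFixed(2).padStart(8));
console.log(lines); // [ "    3.50", "   12.00", "  107.25" ]
```

Because the method lives on the standard String prototype, projects get the behavior with no third-party dependency to disappear out from under them.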

Hardware improvements are coming

JavaScript has reached a point in its maturation and evolution where the hardware is what's holding it back. "I think one of the bigger opportunities with JavaScript is in the hardware," said Anglin. The industry is at a point where the software engines powering JavaScript are pretty fast and have been optimized for the language. "The optimizations that can take it even further are now showing up in the hardware," said Anglin.

At this point, certain chip manufacturers are optimizing their chips so that JavaScript can be processed faster. "I think what's nice about that is that it will actually benefit a lot of other scenarios where JavaScript may run in the future, like embedded devices or low-powered devices, where you don't necessarily have as much horsepower to drive the software that can make JavaScript faster," said Anglin. "The things that will allow JavaScript to be faster at that point will actually be in the hardware." He added that those sorts of optimizations had already been put into place for languages like C and C++, so


seeing this coming for JavaScript shows the position that the language is in today. Currently, there are only a few chip manufacturers starting to bake these optimizations in, but Anglin predicts that given the popularity and increasing usage of JavaScript, other manufacturers will soon start to follow suit.

JavaScript is on the rise in education

JavaScript is becoming more commonly used as part of the curriculum in many computer science programs at universities. For a while, many universities used Java as their core language, but it's now more common to see JavaScript or Python being taught. For example, Stanford University recently switched over their introductory computer science course to be taught in JavaScript instead of Java. "I think we're going to see more and more universities start to do their curriculum around JavaScript instead of things like Java," said Manicsic. "Every time somebody wants to learn I always

tell them to start with JavaScript because it has the fundamentals and the core concepts of CS in general, and it teaches you how to think logically just like Java, .NET, and C# do, but it’s much more human-readable.” She believes that because JavaScript is so approachable, more schools will start making it the first step in introducing computer science. In addition to it being more approachable than some other languages, more and more people are hiring for JavaScript developers, and universities recognize that and want to make sure that they are preparing their students for the real world of web development.

Using JavaScript to power data analytics

JavaScript has even moved beyond simply being a web development language. In the age of Big Data, being able to analyze all of the data that you have is essential and can help you unlock previously unknown insights. Analytics company Cambridge Intelligence has used

JavaScript to create a powerful data visualization toolkit. According to Cambridge Intelligence, data visualization can help expose criminals, reveal suspicious behavior, help mitigate risks, and help fight fraud. “If I show you the difference effectively from looking at a table like this, to looking at connections of the same data set, then immediately you can see there’s a richness and a lot of information in the connections themselves that you cannot possibly ever find within a spreadsheet,” said Joe Parry, founder and CEO of Cambridge Intelligence. When people go from looking at data on a spreadsheet to suddenly seeing it laid out in this exciting way, it changes their interest, Parry explained. While Cambridge Intelligence isn’t the only company providing data visualization tools, it is a testament to the power of JavaScript for being able to run all of this and run it on practically any device. z
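Parry's table-versus-connections point is easy to demonstrate: a few lines of JavaScript can reshape flat rows into the node-link structure a visualization toolkit consumes. The transaction data and field names below are invented for illustration and are not Cambridge Intelligence's API:

```javascript
// Reshape flat "spreadsheet" rows into a node-link graph structure,
// the form a network-visualization library typically consumes.
// The transaction records here are invented for illustration.
const transactions = [
  { from: "acct-1", to: "acct-2", amount: 500 },
  { from: "acct-2", to: "acct-3", amount: 450 },
  { from: "acct-3", to: "acct-1", amount: 400 }, // money loops back
];

// Unique account IDs become nodes; each row becomes a link.
const nodes = [...new Set(transactions.flatMap(t => [t.from, t.to]))]
  .map(id => ({ id }));
const links = transactions.map(t => ({
  source: t.from, target: t.to, value: t.amount,
}));

console.log(nodes.length); // 3
// A cycle like acct-1 -> acct-2 -> acct-3 -> acct-1 is obvious in a
// graph view, but invisible across three unrelated spreadsheet rows.
```

The suspicious loop is exactly the kind of relationship Parry says "you cannot possibly ever find within a spreadsheet" but that jumps out once the same rows are drawn as a graph.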



Buyers Guide

Digital transformation sparks low-code adoption

Digital transformation has been a popular buzzword over the last year as businesses try to navigate what the term truly means and how it can be achieved. Research firm Gartner refers to the digital transformation of a business as "the process of exploiting digital technologies and supporting capabilities to create a robust new digital business model."

Over the last couple of years, everything has become digital, explained Geoff Perlman, founder and CEO of Xojo, a cross-platform application development tool provider. "Today, if it can be digital, it is. The result of that is that computers and solutions have become such a critical part of our business lives and personal lives." Couple that with the ongoing need for speed, and a business ends up with a whole lot of complexity. Businesses need to innovate faster and deliver high-quality solutions and features more frequently, but they can't keep up with the supply and demand. "Businesses are unable to deploy the technological solutions that they need to continue to remain competitive, continue to service and delight their customers and employees alike," said Ben Saren, CEO of Dropsource, a low-code mobile development platform provider.

To be able to successfully achieve all that, businesses have also been going through a parallel metamorphosis. The idea of implementing and applying low-code development to business strategies has become more appealing. "Species have had to adapt or they went extinct. Technology is the new changing element that requires adaptation," said Perlman. For instance, ride-sharing services like Uber and Lyft have been taking over taxi businesses because of their ability to provide a well-designed application. Blockbuster Video perished after it failed to adopt an online streaming service in time, Perlman explained.

"The thing people don't realize is that most people work for small businesses, and the problem is that small businesses can't usually afford to have an IT staff," Perlman explained. The power of low-code is appealing because it makes all this technology available to a broader set of people. "With a little bit of time to learn something new, nontechnical people can solve a business problem without having to have an IT staff to do it," he said.

Even if your business can afford an IT staff and development team, Perlman explains that there is always going to be way more work to do than people to do it. "Historically, businesses have always had to prioritize and that meant some things that their customers and other people in other departments in their company wanted to do just didn't get done because there wasn't enough people," he said. "Now with citizen developers and low-code, that burden can be lifted and it allows IT to prioritize the bigger things and it empowers other departments to no longer be reliant or dependent upon IT for everything."

Empowering the people

Low-code solutions are more than just drag-and-drop, visual programming interfaces. The features these tools possess can give businesses the power to answer a big part of the question of digital transformation, which is the 'how?' But before businesses embark on how they should approach low code for their digital initiatives, they need to understand the why and the when, according to Jorge Sanchez, director of product strategy at Appian, an enterprise low-code and BPM software provider. "It's not just about the platform you build with, it is about the collaborative design and making sure you are mapping business objectives with the actual value that you are going to derive from the app," he said. A good low-code platform provider will help you answer the why and when before jumping to the how, Sanchez explained. continued on page 34 >

Addressing digital transformation weakness: mobile app development

Low-code mobile development platform provider Dropsource revealed in the results of a recent survey that mobile was the "Achilles' heel" of digital transformation. According to the report, 84 percent of digital leaders feel like they are "behind the 8-ball" when it comes to developing mobile apps and supporting business goals. In addition, 25 percent of respondents stated they felt extremely behind the curve and 41 percent believe their inability to develop mobile apps is negatively impacting their ability to compete in the market.

"There is that urgency these businesses are feeling to get mobile solutions deployed into their customers' hands, into their employees' hands, partners' hands etc.," said Dropsource's CEO Ben Saren. "It is only increasing; and their ability to move quickly is not there yet by and large."

"Successful digital transformation today requires bold, next-level mobile innovation," the report stated. "Technology changes fast, but mobile apps change faster, making digital transformation not only challenging, but immediate."

Saren explained the problem is that development and IT teams

are too often not brought into the conversation sooner. Instead, digital leaders and chief data officers expect their development teams to build something, when in reality teams are not well equipped to deploy those solutions. “The sooner development teams have a seat at the table, the better and more successful you are going to be at generating the kind of buy-in you need within your organization,” said Saren. “Digital leaders, IT and development teams need to work together in a more constructive way and with an open mind so they can come up with the appropriate compromises for solving the problem, because everyone will have to make compromises to get to the end result.” Equipped with a low-code solution, developers can more easily deploy and maintain enterprise-class mobile apps without sacrificing quality, control, performance or user experience, and the elimination of manual development frees them up to work on top-level features, Saren explained. “You have big goals, but live in limbo-land where expectations are high and yet there’s not often enough budget or understanding of what it really takes to succeed in mobile,” the report stated. “It’s time to admit you can’t keep doing things the same way, using the same tools and expect different results.” z

< continued from page 33

Low-code platforms should also enable users to reuse components in a smart way. "Another reason that low-code platforms are really good at enabling digital transformation is because the speed of change keeps increasing," said Sanchez.

In addition to reusing components, Xojo's Perlman believes a platform should enable you to reuse your knowledge. Instead of having to take the time to learn the skills a certain environment targets, such as JavaScript or HTML, it is a huge benefit if the solution allows you to apply the same skills to other platforms.

According to Dropsource's Saren, some low-code tools provide no runtime platforms, meaning they don't lock the business into anything and give it the freedom and control to develop and target different platforms as necessary. This is especially important in today's digital transformation era, as businesses now feel pressure to develop for the variety of devices on the market.

Xojo's Perlman explained some low-code solutions also offer a cloud offering that takes care of the security and maintenance so all the user has to do is press deploy. "This is huge for citizen developers, because they can get back

Becoming a "10Xer"

Despite the many benefits low-code solutions promise, many professional developers are still hesitant about adding them to their toolbelt. A recent report from Evans Data found developers are slow to adopt low-code/no-code solutions. "While low-code platforms enable speedier development of UI and other common elements through templates, frameworks, and visual design tools, they also put limitations on developers because the very nature of supplied components and methods implies a restriction to adhere to those supplied components and practices. Developers who may have spent many years perfecting their craft are hesitant to discard the flexibility they have when custom-coding their apps for the easier construction and quicker time to market that low code platforms provide," Evans Data stated in an email to SD Times.

However, Jorge Sanchez, director of product strategy for Appian, thinks professional developers are missing the point. According to Sanchez, low-code solutions can not only help developers become more productive, but can also help them improve their craft. There is a rare breed of programmers that are considered "10Xers." "10Xers" are the Mark Zuckerbergs of the world that can build something like Facebook all by themselves, Sanchez explained. With the power of low code, developers can speed up the process of coding, deploying scripts, and troubleshooting, and get back to what they really care about, which is building and delivering the application.

"If you like spending long nights and working all weekend just to keep up with everything that is happening, then sure, by all means do so. But if you can do more with less, why not work smarter instead of just harder?" said Sanchez. "If you can build a whole application in a matter of weeks, then there is more you can get to and less weekends you have to sacrifice. You can become a 10Xer yourself because there is so much assistance the platform is giving you." z

to doing whatever it is that their job title says they need to do,” he said. That, though, does require the user to have some level of trust with the platform, according to Sanchez. Businesses should understand the type of security and operational services a low-

code tool is providing so they know what they have to worry about. Ideally, the tool should enable the user to focus on the feature they are delivering and manage the rest, Sanchez explained. Low-code tools and initiatives should continued on page 39 >

How low-code solutions can help advance digital transformation

Jorge Sanchez, director of product strategy at Appian

In a number of different ways: easier evolution, faster transformation, and continuous platform innovation. When you are working to build an application you care about, there are a number of things you have to worry about. First, you want to make sure you have improved agility. Can you actually deliver on the promise of keeping up with change and evolution? Secondly, you want decreased costs. With low code, developers can quickly become '10Xers,' and business people can actually start implementing applications with services that allow them to do citizen development. So, you can build more with less and, of course, this means a reduction in cost, not just in IT. The people you use to build, design and use the application also become more productive.

That brings me to the next point, which is higher productivity. Because you can build more with less or do more with the resources you already have, you find out that instead of IT spending more of their time in the upkeep and maintenance of an application, they can spend more of that time in the innovation and evolution. You also get better customer experience because you can leverage the capabilities that Appian provides in order to give you the latest and greatest look and feel. It is not just boring software we provide. It is a beautiful, reactive application that people are used to. The newer generations are not used to Excel spreadsheets, but rather the experience that they see with applications like Instagram, Facebook and all of these beautiful solutions.

Lastly, we provide effective risk management and governance. When you don't have to think about all of these issues with each and every single application that you build and you realize that the platform automatically comes coupled with a lot of security, performance and scalability, then you no longer have to worry about ever-changing regulations. Companies spend millions and billions of dollars changing the way their application works just because of new regulations. When you build it on top of a platform that is already compliant, your applications will become compliant. That means that your changes can be done more easily because you don't have to consider everything.

Ben Saren, CEO of Dropsource

Time-to-market is a critical success factor of digital transformation. Dropsource helps digital leaders and their teams to accomplish in mere days or weeks what traditionally takes months or years. This dramatic increase in speed means more rapid time-to-market, which greatly expands the range of what can be achieved. These benefits enable companies to deliver on the promise of digital transformation without compromise. Dropsource appeals to the business needs of digital leaders and meets the requirements of even the most scrutinizing developers. Companies that excel at delivering mobile solutions, and those that don't, can benefit equally from Dropsource's solutions.

Geoff Perlman, founder and CEO of Xojo

With each passing year, the world around us becomes increasingly digital. Most of the ways in which we communicate with each other are now digital. This profound transformation is having a greater impact on the world than all of our previous transitions combined. It was not long ago that all computing was done only in specific departments inside large companies and universities. Today you can walk into any shopping mall and find toddlers expertly navigating smartphones to find their favorite photo, video or game. So much of what we do now for both work and in our personal lives involves information technology.

The result is that more and more of us want to be able to create apps in order to automate the logic of many of the tasks that waste our personal and professional time so we can focus our energy on those things that can't be automated. The problem is, app development with traditional tools is still a skill that can take many years to reach proficiency. That's fine for professional programmers, but it's not a solution for most people. Xojo solves this problem by taking care of the complex details of all the important computing platforms to allow the user to focus on what makes their application unique. The result is that Xojo users can quickly learn app development and create solutions in a tenth of the time it would take with traditional tools. Also, since every platform has its own unique set of functions, the effort involved to reach another platform can easily double. Because Xojo abstracts away these details, the user can deploy their app to multiple platforms without much extra effort at all.

A guide to low-code, no-code solutions

• AgilePoint NX is a low-code development platform that allows both developers and “citizen programmers” to easily implement and deploy cross-functional, cross-organizational business apps into digital processes across multiple environments and cloud platforms. AgilePoint is the world’s first truly future-proof digital transformation platform. Its “build once and reuse many times” feature defines a new concept in application development.

• Alpha Software offers the only unified mobile and web development and deployment platform with distinct “no-code” and “low-code” modes. The platform materially accelerates digital transformation by allowing line-of-business (LOB) professionals to work in parallel with IT developers to build the smartphone apps they need themselves, significantly cutting the bottleneck traditionally associated with mobile app development.

• Altova’s MobileTogether platform is a low-code solution that makes it easy for non-developers to create mobile applications, yet provides features powerful enough to support developers. With MobileTogether, users can develop native, data-driven apps for all platforms as well as sophisticated enterprise solutions.

• Appian: Appian provides a software development platform that combines intelligent automation and enterprise low-code development to rapidly deliver powerful business applications. Many of the world’s largest organizations use Appian applications to improve customer experience, achieve operational excellence, and simplify global risk and compliance. To learn more about building enterprise apps faster with no compromise, join the conversation at #automatemorecodeless.

• Betty Blocks is proud to be the world’s first truly no-code platform. The company’s goal is that by 2023, anyone can build an application. ‘How can we make it easier?’ is the foundation of everything Betty Blocks does. Develop applications rapidly and intuitively through visual modeling; 100 percent in the cloud, multi-device and cross-platform, designed with the flexible UI builder. And all that without any code, making it surprisingly easy, fast, fun and impactful.

• Capriza ApproveSimple accelerates corporate approvals for immediate business impact. ApproveSimple streamlines approvals from any business system, anywhere you need them: desktop, mobile and more. Make more informed business decisions, faster.

• Caspio is embraced by business developers for its ease of use, speed to market and enterprise-grade features. Using visual point-and-click tools, business developers can execute the entire application design, development and deployment process, allowing them to rapidly deliver a minimum viable product and continue iterating as the market requires.

• Dell Boomi is a provider of cloud integration and workflow automation software that lets organizations connect everything and engage everywhere across any channel, device or platform using Dell Boomi’s industry-leading low-code iPaaS platform. The Boomi unified platform includes Boomi Flow, low-code workflow automation with cloud-native integration for building and deploying simple and sophisticated workflows to efficiently drive business.

• Dropsource: Dropsource is a uniquely differentiated low-code mobile application development platform for enterprise developers building powerful, truly native mobile apps. Dropsource provides seamless data integration for enterprise systems, an intelligent drag-and-drop UI, and outputs computer-generated, truly native Swift and Java code. Developers of all skill levels can rapidly prototype, build, deploy and maintain mobile applications without sacrificing quality or performance, in a fraction of the time of yesterday’s methods.

• Xojo: Xojo is a cross-platform development tool for building native apps for desktop, web, mobile and Raspberry Pi. Xojo applications compile to machine code for greater performance and security. It uses native controls so apps look and feel right on each platform. Since one set of source code can support multiple platforms, development is 10 times faster than with traditional tools. Xojo comes with a drag-and-drop user interface builder and one straightforward programming language for development.

• Kintone is a cloud-based workflow, communication and reporting platform that empowers teams to accomplish more projects and better serve their businesses and communities. With Kintone’s super-flexible, super-functional interface, it’s easy to build, customize, and share powerful enterprise

apps at lightning speed. Ditch the spreadsheets and automate tasks with workflows, see the status of projects in real time, and never forget a thing with notifications and reminders that keep everyone moving in the same direction. And rest assured that with Kintone’s secure platform and granular permission controls, your data is always safe.

• K2 offers an established platform that excels across mobile, workflow and data. K2’s core strength is support for building complex apps that incorporate all three. The company provides a data-modeling environment in which developers can create virtual data views that bring multiple systems of record together into a single, abstract view of the data.

• When it comes to low code, the Kony AppPlatform is a proven leader and the partner of choice for the world’s most trusted brands. The AppPlatform delivers speed without compromise, accelerating development with reusable components and real-time collaboration tools that keep projects on track and team members aligned. A rock-solid centralized code base powers all devices and operating systems, integrating with 100% of the native OS for true native experiences while streamlining support and minimizing maintenance. Kony recently introduced Progressive Web Apps to its platform, bringing the ability to build PWAs to low-code development.

• Mendix, the global leader in low code, is transforming the world of legacy software and application development by bringing business and IT teams together to rapidly and collaboratively build robust, modern applications for the enterprise. The Mendix application development platform directly addresses the tremendous worldwide software developer talent gap, involving business and IT from the very start and throughout the entire application building and deployment process. Recognized as a “Leader” by top analysts, including Gartner and Forrester, Mendix helps customers digitally transform their organizations and industries by building, managing, and improving apps at unprecedented speed and scale.

• Microsoft enables users to create custom business apps with its PowerApps solution. PowerApps features a drag-and-drop, citizen developer-focused solution designed to build apps with the Microsoft Common Data Service. PowerApps can be used with Microsoft Flow, the company’s automated workflow solution, for data integration. Build apps fast with a point-and-click approach to app design. Choose from a large selection of templates or start from a blank canvas. Easily connect your app to data and use Excel-like expressions to add logic. Publish your app to the web, iOS, Android, and Windows 10.

• Nintex helps enterprises automate, orchestrate, and optimize business processes. With the company’s intelligent process automation (IPA) solutions, IT pros and line-of-business employees rely on the Nintex Platform to turn manual or paper-based processes into efficient automated workflows and to create digital forms, mobile apps, and more.
• Oracle Autonomous Visual Builder Cloud accelerates development and hosting of engaging web and mobile applications with intuitive browser-based visual development on the same enterprise-grade cloud platform that powers Oracle SaaS Applications. Create business objects, add process automation, integrate external systems and, when needed, leverage standard JavaScript to create amazing apps faster.

• OutSystems is the #1 low-code platform for rapid application development. Recognized as a Leader by both Forrester for low-code platforms and Gartner for mobile application development and high-productivity application platform-as-a-service, OutSystems is the only solution that combines the power of visual development with advanced mobile capabilities. Because it was designed to make building high-quality, enterprise-grade applications easy, thousands of customers trust OutSystems to deliver entire portfolios that easily integrate with existing systems — incredibly fast.

• Pegasystems: The Pega low-code application development platform delivers apps faster than traditional approaches. Business and IT collaborate in real time, using visual models to capture business requirements and to quickly iterate and scale apps while ensuring nothing gets lost in translation. Pega automatically generates the application and its documentation audit trail, leading to a 75 percent reduction in development costs.

• Quick Base is the industry’s leading no-code application development platform. Used by more than 6,000 customers — including half of the Fortune 100 — Quick Base seeks to empower users to solve business challenges without compromising IT governance. Forrester recognized Quick Base as a Leader in its 2017 Low-Code Platforms for Business Developers Wave, where it was the only technology to receive differentiated ratings in eight assessment criteria.

• Sencha Ext JS provides everything a developer needs to build data-intensive web applications. The framework includes a powerful library of 115+ pre-tested, integrated and professionally supported components and tools to simplify and accelerate web app development.

• WaveMaker provides an enterprise low-code platform designed to combine the speed of low code with the power of custom code. It features rapid UI development, multi-channel delivery, visual data integration support, customization capabilities, and out-of-the-box application security. Its recent WaveMaker 10 release focused on developer productivity, with updates to IDE support, a new studio workspace sync plugin, and new language support.



< continued from page 34

also cater to the average person, Xojo’s Perlman explained. While tools can help ease the pain for development teams, the majority of the business users working with them will be nontechnical people, Perlman said. “Tools should enable incremental learning,” he said. “A lot of times, tools overwhelm users with a number of menus and icons. If you start with a simple approach, and reveal things over time, it helps someone who is maybe not as technical to learn as they go.”

Additionally, Dropsource’s Saren warns against being swayed by big-name companies. A lot of people assume that if a product comes from a popular company, it is more likely to stick around for the long haul, he explained. “Be open-minded and willing to look at solutions you might not have historically considered. The traditional and conventional approach is not working anymore, and these kinds of demands require out-of-the-box solutions,” he said. “Don’t make compromises with proprietary solutions that give you inferior results, inferior code or somehow lock you into extensive runtime environments.”

Whether we call it digital transformation or not, this space will continue to play out over the next couple of years. “Digital transformation is just a constant evolution towards the latest and greatest technologies that the world has come to expect,” said Saren. “It is not just about bringing in new technologies, checking the boxes and saying okay, we have a website or a mobile app. You have to continue to innovate.”

“The promise of software, which is improving automation, is going to remain the same. The terminology might change. Today, we call it digital transformation. Tomorrow we might call it augmented reality transformation. It is all going to be with the same purpose: improving automation, improving business outcomes and ensuring we reach our customers in a better, more successful way,” he said.
“The whole premise of low code is to help companies and organizations keep up with the ever-growing pains of change.”





The modern security hero is a developer

Altaz Valani is the Research Director at Security Compass.


As software becomes more sophisticated, the need for a security culture in organizations becomes more urgent. Yet organizations’ security teams rarely have the resources and expertise to support developers. In fact, the BSIMM 2016 survey indicates that there is just one security expert for every 245 software engineers. Not only do organizations lack the resources and expertise, but security professionals lack direct influence over development teams, aside from enforcing policy. Nevertheless, it is the security professionals’ responsibility to improve the security of software, and developers don’t have sufficient incentive to be interested in security practices — which notoriously slow down their workflow. So, despite the need for a security culture in organizations, the currently defined organizational roles and responsibilities aren’t conducive to creating one.

To address this security deficit, one notable response is to establish a Security Champions program. One implementation of such a program designates a member from each development team as the ‘Security Champion,’ who acts as the team’s security conscience. This person leads all security activities on the development side and plays a major role in facilitating an organization-wide security culture shift. We’ve already seen proof of this program’s success at major corporations, including Adobe, which has a ‘Belt’ program, and Cisco, which has a ‘Security Ninjas’ program. Given these successes, Gartner estimates that by 2021, 35% of enterprises will have a Security Champions program, a significant rise from 10% of enterprises in 2017.

So, we might ask ourselves: how can Security Champions fill the existing security gap in organizations and facilitate a cultural shift?

They’re the bridge-builders between development and security. The Security Champion builds a relationship between members of the development team and members of the security team.
They facilitate all communication between the two groups and help to instill a security conscience in developers. Ultimately, the Security Champion raises awareness about security needs among developers to help


nurture a security culture.

They’re the go-to security experts on the development team. The Security Champion helps drive security-related improvements within their development team. They assist in executing application security activities, and they ensure that security is integrated into the development process. If developers on their team have security-related questions, they can go to their Security Champion for guidance.

They’re trained to be security experts. Once recruited, Champions are provided with training materials, including books, written resources and eLearning courses. They’re also enrolled in instructor-led training, which delves into application security related to the OWASP Top Ten, as well as security tool training. They may even receive supplemental training in the form of Lunch and Learns or other events.

They’re motivated to be security-minded employees. Developers enter the Security Champion role knowing that it can help them grow in their careers. They’re often offered external and internal certifications for each tier of Security Champion achieved.

Security Champions in the bigger picture: A software engineer who also acts as a Security Champion is an important asset for development teams, helping to teach other developers security best practices while ensuring that secure code is deployed. Ultimately, applications that have security built in from the early stages of the software development lifecycle will have more secure frameworks and architectural designs, as well as a decreased attack surface. This helps to reduce the risk and potential damage that could occur while running critical applications in production. Having a Security Champion in your organization is an easy way to catch vulnerabilities and security defects before an application is released to production, preventing hackers from exploiting the software and protecting the organizations that produce it.
In tandem with eLearning courses and adequate security training platforms, Security Champions programs play a crucial role in shifting the cultural paradigm toward a security-conscious development environment.




Developer or User?

Software developers create an intangible product — software — to define the initial configuration of a processor’s memory. The right memory configuration causes a processor to do what it’s supposed to do. We all tend to refer to the text representations of code as ‘software’ — when in fact, this text is just one step in the process of getting from a user need to the right memory configuration. The steps between articulating a need and delivering a completed software module produce many artifacts that represent the software.

High-level constructs have served as software design and documentation tools for a very long time. From functional block diagrams to the Unified Modeling Language (UML) and SysML, the idea that ‘a picture is worth a thousand words’ has had its place in software development. But if you really need to know what’s going on, don’t you, like everyone else, want to look at the code?

Yet in some environments, there isn’t any code to look at. Think of the software that configures programmable logic controllers (PLCs) in industrial automation systems. The ladder-logic representations of sense-and-act conditions and sequences are the only representation the PLC programmer needs — and this has been a primary way of programming these special-purpose computers for many years. That can be a good thing: it enables, for example, development tools that enforce specific ladder-logic design rules for safety-critical systems.

Other environments in which software can be generated without handling source code include several where graphical blocks can be associated with code elements. Software engineers connect the blocks to make a model and define the execution and data communication logic between the code elements. Press a button, and the diagram is translated into software.
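The ladder-logic style of sense-and-act programming maps cleanly onto conventional code, which helps show why the diagram alone is enough for the PLC programmer. Below is a minimal, hypothetical sketch in Python of the classic start/stop ‘seal-in’ rung, evaluated once per scan cycle; the rung, the input names and the scan-cycle shape are invented for illustration and do not correspond to any particular PLC product.

```python
def scan_cycle(inputs, state):
    """One PLC-style scan: read inputs, evaluate rungs, latch outputs.

    Rung: the motor runs when start is pressed (or it is already
    running), unless stop is pressed -- the classic seal-in pattern.
    """
    start = inputs["start_button"]
    stop = inputs["stop_button"]
    # Ladder equivalent: |--[ start OR motor ]--[/ stop ]--( motor )--|
    state["motor"] = (start or state["motor"]) and not stop
    return state

state = {"motor": False}
state = scan_cycle({"start_button": True, "stop_button": False}, state)
assert state["motor"] is True   # pressed start: motor latches on
state = scan_cycle({"start_button": False, "stop_button": False}, state)
assert state["motor"] is True   # start released: seal-in holds
state = scan_cycle({"start_button": False, "stop_button": True}, state)
assert state["motor"] is False  # stop always wins
```

A graphical tool that enforces rules about such rungs (for example, requiring a stop contact on every motor output) is exactly the kind of safety-critical design-rule enforcement described above.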
I remember being mildly surprised when, some years ago, a head of embedded automotive software development explained that their team hardly ever used ‘code’ anymore; everything was developed with diagrams.

Configure or program?

Many simulation systems used in (real) product development provide graphical interfaces.

These allow systems to be represented by connecting the diagrams that represent each component. The resulting simulations can be complex configurations of simulation software, sometimes integrating multiple independent software packages. Business intelligence software offers similar capabilities: for example, users can drag and drop data items or whole databases to define data sources to display or correlate.

These examples trigger a question: where, if it exists at all, is the boundary between configuration and software development? And if this boundary does exist, does it help or hinder software development? Should we be standardizing this interface, or demolishing it?

A developer can ‘use’ an application programming interface (API), but a user needs a tool to help them use an API. Software engineers have forever appreciated the value of reuse. Services and microservices offering APIs are evidence of this desire to create software that has value in more than one context. A software developer is a ‘user’ of existing elements (for example, through APIs), and an author of the new connections between them.

Some developers see opportunities to get more done by building tools. Some of these tools are built to allow users to build the connections between existing software elements. The result can look like a new software system.

So, what comes after APIs? Concepts from object brokers, message-based systems and service registries have something to offer. These approaches don’t guarantee universal connectivity, but they can automate the handling of some of the details. This may be enough to allow a user to configure (or should that be develop?) the software they need from components. Or perhaps this should be seen as development automation, handling the details so that developers can concentrate on the creative and conceptual challenges of new software.
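The registry idea sketched above (components registered once, then wired together by configuration rather than by hand-written glue code) can be illustrated in a few lines of Python. This is a toy sketch, not any real broker or registry product; all names are invented for illustration.

```python
# A toy service registry: components register callables under names;
# a "configuration" is just an ordered list of component names.
registry = {}

def register(name):
    """Decorator that publishes a component in the registry."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@register("clean")
def clean(text):
    return text.strip().lower()

@register("tokenize")
def tokenize(text):
    return text.split()

def build_pipeline(config):
    """Turn a declarative config into runnable software."""
    stages = [registry[name] for name in config]
    def pipeline(value):
        for stage in stages:
            value = stage(value)
        return value
    return pipeline

# The 'user' configures; the registry handles the connections.
pipe = build_pipeline(["clean", "tokenize"])
assert pipe("  Hello World ") == ["hello", "world"]
```

Here the list `["clean", "tokenize"]` is the configuration: someone who never touches the component code can reorder, add or remove stages, which is precisely the blurred boundary between configuring and developing that the column asks about.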
If it works well enough, someone will find a way to automate the use of general-purpose ‘requirements’ to configure existing modules into a new software system.

Peter Thorne is director at analysis firm Cambashi.







David Rubinstein is editor-in-chief of SD Times.

Digital transformation is all about the timing

The software development industry is moving fast. Too fast, if you ask me, and for many of the practitioners I’ve spoken with in the last year at about a dozen events, ranging from development to data to monitoring to DevOps and testing. The folks I’ve spoken with over lunches, in session rooms and at post-conference socials tell me they attend these events mostly to see what the bigger world outside their organizations is up to. They want to learn about managing container clusters, and continuous improvement, and automated everything, but they say their companies are at most in the exploratory stages.

Part of the problem, as I see it, is that the science is far outpacing business, because the two have different goals. It’s the software engineers looking to improve software and systems from a purely technical standpoint, the analysts who are always looking to define markets around these ideas, and the vendors who emerge from or latch onto these projects with enterprise versions, who tell the business world that it needs to adapt and change to survive. Business, though, wants to do business, and with the ever-quickening stream of solutions coming down the pike, technology is becoming — to them — more of a business disruptor than a facilitator.

First, organizations had to become Agile, because their software iterations were too slow. Then they needed to adopt DevOps, because their competitors were beating them to the punch. Then they were told they need a complete digital transformation, replete with a move to the cloud, microservices, containers, citizen developers and what they say is only a ‘minor’ culture shift. (Editor’s note: There is no such thing as a ‘minor’ culture shift. Even a move that the business sees as a little tremor can feel like an earthquake to those affected.)
I had a conversation with Michael Bushong from Juniper Networks at last month’s Gartner IT Infrastructure, Operations and Cloud Strategies Conference, and he pointed out that organizations making these transformations must stop viewing


them as a technology effort. When they do that, Bushong said, “they’ll underestimate the complexity and squander their opportunity.” To reiterate, the point of all this is not to update technology; it’s to get to market faster and remain competitive, or risk losing the business altogether. Often, what that comes down to is timing. Bushong said that “suppliers don’t fail if they’re too slow” to adopt the latest technologies and to enter new markets. “They fail because they’re too early.”

A good example is a company I met in early 2015 called Pneuron. The company created an integration platform that utilized individual neurons (later to be known as microservices) stored in a cortex (a container that distributed messaging) without having to do a lot of aggregation and query writing. It was mind-blowing to me, and Gartner described it as “jaw-dropping.” However, since the market around microservices and containers hadn’t quite formed yet, the terminology Pneuron was using — neurons, cortex — didn’t align with any other conversation. By the time the market began to form, companies like Docker and CoreOS timed their entry better. So, Pneuron was never able to take advantage of the opportunity, because it was too early. I read the company was acquired in February and, like the ark at the end of “Raiders of the Lost Ark,” it will likely disappear into the new company’s larger portfolio.

No question, the push to go fast is real. Every time a market drags its feet today, it seems, it becomes another victim of Amazon — which has gone from selling books to now selling groceries after its Whole Foods acquisition. A headline in the Harvard Business Review claimed, “The Amazon-Whole Foods Deal Means Every Other Retailer’s Three-Year Plan Is Obsolete.” But if you go too fast, and fail to get the culture, the tooling integration and the visibility into the entire process in place, you will have spent a lot of time and money and still missed the window of opportunity.
Clearly, the saying ‘the early bird catches the worm’ does not always apply when it comes to new technology. On the other hand, ‘slow and steady’ doesn’t always win the race. That’s what our industry is wrestling with today.


