Industry’s First Open Metadata Standard Helps Organizations Better Understand, Manage and Gain Value from Data

Vancouver, BC, Canada – August 27, 2018 – Open Source Summit North America — ODPi, a nonprofit organization accelerating the open ecosystem of big data solutions, today announced Egeria, a new project from ODPi that supports the free flow of metadata between different technologies and vendor offerings. Egeria enables organizations to locate, manage and use their data more effectively.

Last year’s ODPi white paper, “The Year of Enterprise-wide Production Hadoop,” found that data governance and security were the biggest blockers to enterprises taking big data into true production. Recent data privacy regulations such as GDPR have brought these concerns to the forefront, and enterprises around the globe need a standard for ensuring that data provenance and management are clear and consistent across the enterprise. Egeria enables this as the only open source driven solution designed to set a standard for leveraging metadata in line-of-business applications and to enable metadata repositories to federate across the enterprise.

“A consistent view on data across the entire landscape is essential for any organisation that wants to become data driven. Not just where the data is, but also the quality, the ownership, and the full lineage across the entire set of technologies used,” said Ferd Scheepers, chief information architect, ING. “The open metadata standard delivered by Egeria delivers this consistent view across all the technologies, while reducing the cost of metadata capture, and the management challenges of working with various data tool vendors.”

Egeria is built on open standards and delivered under the Apache 2.0 open source license. The ODPi Egeria project creates a set of open APIs, types and interchange protocols to allow all metadata repositories to share and exchange metadata. From this common base, it adds governance, discovery and access frameworks for automating the collection, management and use of metadata across an enterprise. The result is an enterprise catalog of data resources that are transparently assessed, governed and used in order to deliver maximum value to the enterprise.
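To make the idea of repositories exchanging metadata over open APIs more concrete, here is a minimal, hedged sketch of a client searching a metadata catalog over REST. The server URL, endpoint path, parameters and response fields are illustrative assumptions, not Egeria’s documented API.

```python
# Hypothetical sketch of querying an open metadata catalog over REST.
# The base URL, path and JSON fields are illustrative assumptions,
# not Egeria's documented API.
import requests

METADATA_SERVER = "https://metadata.example.com"  # assumed server address


def find_assets(search_term: str):
    """Ask the catalog for data assets whose names match a search term."""
    response = requests.get(
        f"{METADATA_SERVER}/open-metadata/assets",
        params={"searchString": search_term},
        timeout=10,
    )
    response.raise_for_status()
    # Assume the catalog returns a JSON document with an "assets" list.
    return response.json().get("assets", [])


if __name__ == "__main__":
    for asset in find_assets("customer"):
        print(asset.get("qualifiedName"), "owned by", asset.get("owner"))
```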

“Egeria’s open source metadata management presents an exciting opportunity to rethink both management and governance of data to provide greater trust and flexibility in how we all share and consume data,” said John Mertic, director of program management, ODPi. “Egeria’s open governance model allows our community and practitioners to develop and evolve the base for use in any offerings and deployments.”

Vendors and end users, including IBM and ING, collaborated on the first Egeria release, which was initially incubated as part of the Apache Atlas project (an open source metadata repository designed for the Apache Hadoop ecosystem). IBM and ING jump-started Egeria with a significant code donation. ODPi members and end users are actively collaborating to expand the Egeria code base with standard integration points between metadata repositories and the line-of-business tools that leverage data. An Apache Atlas patch is available for immediate use, and an Egeria proof of concept is complete for IBM’s InfoSphere Information Governance Catalog.

“Changing the availability and the quality of metadata will in turn improve the agility of the data scientist, as well as the transparency of the results they produce,” said Mandy Chessell, distinguished engineer and master inventor, IBM. “Egeria simplifies metadata capture and management to create a consistent view of data across all tools an organization may use.”

Egeria Project Objectives

The Egeria project focuses on three areas: Automation, Business Value and Connectivity.

  • Automation — Providing an API for components that capture metadata from data platforms as data sources are created and changed. This metadata is stored in the metadata repository, and notifications alert governance and discovery services about the new or changed data source. Egeria also provides frameworks and servers to host bespoke components that automate the capture of detailed metadata and the actions necessary to govern data and its related assets (a minimal sketch of this notification flow follows this list).
  • Business Value — Open metadata and governance provides specialized access services and user interfaces for key data roles such as CDO, Data Scientist, Developer, DevOps Operator, Asset Owner, and Applications. This enables metadata to directly support the work of people in the organization. The access services can also be used by tools from different vendors to deliver business value with open metadata.
  • Connectivity — Connectivity enables a peer-to-peer Metadata Highway, offering open metadata exchange, linking and federation between heterogeneous metadata repositories.
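As a rough, hypothetical illustration of the Automation flow described above, the sketch below shows a handler that records metadata for a new or changed data source and then alerts a discovery service. The event shape and helper names are assumptions made for illustration, not Egeria interfaces.

```python
# Hypothetical sketch: handle a "data source created/changed" notification by
# recording its metadata and alerting a discovery service.
# The event fields and helper functions are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class DataSourceEvent:
    source_name: str   # e.g. a newly created table or file
    change_type: str   # "CREATED" or "UPDATED"
    location: str      # where the data source lives


def store_metadata(event: DataSourceEvent) -> None:
    """Placeholder for writing captured metadata to the metadata repository."""
    print(f"storing metadata for {event.source_name} at {event.location}")


def notify_discovery(event: DataSourceEvent) -> None:
    """Placeholder for alerting governance and discovery services."""
    print(f"discovery requested for {event.source_name} ({event.change_type})")


def handle_event(event: DataSourceEvent) -> None:
    store_metadata(event)
    notify_discovery(event)


if __name__ == "__main__":
    handle_event(DataSourceEvent("sales_2018", "CREATED", "warehouse/sales.csv"))
```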

“As a leader in advanced analytics, SAS understands the value of full transparency related to data provenance and governance,” said Craig Rubendall, vice president of Platform R&D, SAS. “Egeria and its open approach to metadata management and integration only underscores further the need for metadata standards to promote responsible data exchange across varied technology environments.”

Additional Resources

About ODPi

ODPi is a nonprofit organization committed to simplification and standardization of the big data ecosystem. As a shared industry effort, ODPi members represent big data technology, solution provider and end user organizations focused on promoting and advancing the state of big data technologies for the enterprise. For more information about ODPi, please visit: http://www.ODPi.org

###

 

By Ole Lensmar, Chairperson of the OpenAPI Initiative

Today, eBay announced that it is leveraging the OpenAPI Specification (OAS) for all of its RESTful public APIs. With OpenAPI, developers can download an eBay OpenAPI contract, generate code and successfully call an eBay API in minutes. APIs play a critical role in eBay’s Developer Ecosystem, helping the company build and deliver the best experiences to its buyers and sellers.
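As a hedged illustration of that workflow, the sketch below downloads an OpenAPI contract and then calls an operation of the kind such a contract might describe. The contract URL, API base, path and token are placeholders invented for the example, not eBay’s actual endpoints; in practice, a developer would typically feed the contract to an OpenAPI code generator to produce a typed client.

```python
# Hypothetical sketch of the "download a contract, then call the API" flow.
# The URLs, path and token are placeholders, not eBay's real endpoints.
import requests

CONTRACT_URL = "https://developer.example.com/apis/browse/v1/openapi.yaml"
API_BASE = "https://api.example.com/browse/v1"
ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"  # obtained through a separate OAuth flow


def download_contract() -> str:
    """Fetch the OpenAPI contract so tooling can generate a client from it."""
    response = requests.get(CONTRACT_URL, timeout=10)
    response.raise_for_status()
    return response.text


def search_items(query: str) -> dict:
    """Call an operation of the kind the contract describes (path is assumed)."""
    response = requests.get(
        f"{API_BASE}/items/search",
        params={"q": query},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(len(download_contract()), "bytes of contract downloaded")
    print(search_items("vintage camera"))
```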

“The move to using the OpenAPI Specification was a unanimous choice given our needs and knowledge of the incredible ecosystem of developers that surround OpenAPI,” said Gail Frederick, GM of eBay Portland and VP Developer Ecosystem at eBay. “The OpenAPI Specification is the de facto standard for describing APIs and plays a critical role in the new microservices-based architecture at eBay.”

As a member and chairperson of the OpenAPI Initiative, I see more and more companies moving to distributed and microservice-based architectures, because building quality experiences for users and shipping products or services to market faster is a linchpin of any business’s success. Technologies and tools created to support this transition are largely built through open collaboration, spanning application development technologies like Node.js to container orchestration systems like Kubernetes. Since APIs are the “glue” between distributed components, the OAS standard plays a central part in this transition.

This was definitely the case with eBay. As eBay transitioned from a monolithic and centralized architecture to a distributed microservice architecture, the company needed to evolve the way service contracts were explored, tested, published, and integrated with API specifications.

The company had a set of needs for this transition:

API contracts would need to meet the needs of seamless exploration and integration across a diverse technology stack, be industry standard, and be feature rich to complement our Technical Standards and governance models. This necessitated the exploration of a new specification. The primary criterion was a specification that was both human and machine readable, language agnostic, vendor-neutral, and open source.
Shekhar Banerjee, Senior MTS Architect, eBay

OAS became the unanimous choice thanks to its tooling support, fully customizable stack, and support for both code-first and contract-first approaches to API development, and, most importantly, because OpenAPI continues to evolve as a standard led by open collaboration within the OpenAPI Initiative. The move to OAS furthers eBay’s mission for its Developer Ecosystem: to promote developer efficiency and productivity, with no more SDKs and no more hours spent writing API client code.

eBay has been a member of the OpenAPI Initiative since August 2017 and was one of the first in the industry to publish contracts based on the OpenAPI 3.0 specification. We are very excited to see eBay’s continued support of our consortium, as well as of other open collaboration projects, including the Cloud Native Computing Foundation (CNCF). We look forward to sharing more about eBay’s success with OAS, and about the many users and members that make up our ecosystem, at the API Strategy & Practice Conference happening September 24–26 in Nashville, Tennessee. Learn more about this conference here, and keep up to date with news coming out of the OpenAPI Initiative here.

 

CIP aims to create an interoperable open source software platform that is secure, reliable and sustainable for more than 10 years

TOKYO, JAPAN – June 20, 2018 – The Civil Infrastructure Platform (CIP) Project, which aims to provide a base layer of industrial grade open source software components, tools and methods to enable long-term management of critical systems, today announced a new collaboration with the Debian LTS Initiative to use Debian, the universal operating system that is available to developers & companies as free, open source software. This collaboration builds upon CIP’s mission of creating an open source framework that provides the software foundation needed to deliver essential services for civil infrastructure and economic development on a global scale.

In this new partnership, CIP will specifically help with Debian Long Term Support (LTS), which aims to extend the lifetime of all Debian stable releases to more than 5 years. CIP will work with Freexian, a multi-faceted services company that is leading the effort for Debian LTS, to maximize interoperability, security and LTS for open source software for embedded systems.

CIP will contribute in several ways, including:

  • Funding for Debian LTS activities
  • Working toward interoperability by harmonizing software and other elements
  • Collaborating on common elements

“CIP’s mission of creating industrial grade open source software aligns with our goal of developing a free and universal operating system,” said Chris Lamb, Debian Project Leader. “We are excited about this collaboration as well as the CIP’s support of the Debian LTS project which aims to extend the support lifetime to more than five years. Together, we are committed to long term support for our users and laying the ‘foundation’ for the cities of the future.”

CIP has had a long history of working with Debian as most control systems for transportation, power plants, healthcare and telecommunications run on Debian embedded systems.

Hosted by The Linux Foundation, CIP aims to speed implementation of Linux-based civil infrastructure systems, build upon existing open source foundations and expertise, establish de facto standards by providing a base layer reference implementation, and contribute to and influence upstream projects regarding industrial needs. This collaboration with Debian will help CIP get one step closer to achieving their goals of providing long term support for critical systems through industrial grade software and a universal operating system.

A Growing Ecosystem

In addition to the new collaboration with Debian, CIP also welcomes Cybertrust Japan Co., Ltd. as a new Silver member. By joining CIP, Cybertrust, a company that supplies enterprise Linux operating systems, advances its commitment to building secure and reliable embedded equipment and systems.

“Linux industrial or automotive-grade embedded systems are exposed to serious security threats and our customers expect long term Linux security patches,” said Tatsuo Ito, Vice President and CTO for Cybertrust. “CIP has this expertise and shares the same goals as we do. We believe that together, we can address these critical issues.”

Cybertrust joins other industry leaders, such as Codethink, Hitachi, Moxa, Plat’Home, Renesas, Siemens and Toshiba, in their work to create a reliable and secure Linux-based embedded software platform that is sustainable for decades to come.

“The CIP Project continues to achieve milestones to build an interoperable open source platform that is secure, reliable and sustainable for more than 10 years,” said Urs Gleim, Head of the Central Smart Embedded Systems Group at Siemens and CIP Governing Board Chair. “We are thrilled that Cybertrust has joined CIP and will provide expertise in security and digital authentication based on Server Linux Distributor (Asianux) and their OTA implementations.”

Open Source Summit Japan

CIP will be at The Linux Foundation’s Open Source Summit Japan from June 20–22, 2018. The project will have a booth in the sponsor showcase, with interactive demos from Hitachi, Plat’Home and Renesas. CIP leaders will also be on-site to answer questions, discuss the importance of industrial grade open source software, and explain how it impacts the cities of the future. Additionally, CIP has two speaking sessions: a CIP introduction and an overview of CIP kernel maintenance. For more details about those sessions, click here.

Additional CIP Resources:

About CIP

The Civil Infrastructure Platform (CIP) is an open source project hosted by The Linux Foundation. The project is focused on establishing an open source base layer of industrial grade software to enable the use and implementation of reusable software building blocks that meet the safety, reliability and other requirements of industrial and civil infrastructure. For additional information, visit https://www.cip-project.org/.

###

open mainframe

To learn more about open source and mainframe, join us May 15 at 1:00 pm ET for a webinar led by Open Mainframe Project members Steven Dickens of IBM, Len Santalucia of Vicom Infinity, and Mike Riggs of The Supreme Court of Virginia.

When I mention the word “mainframe” to someone, the natural response is colored by a view of an architecture of days gone by — perhaps even invoking a memory of the Epcot Spaceship Earth ride. This is the heritage of mainframe, but it is certainly not its present state.

From the days of the System/360 in the mid 1960s through to the modern mainframe of the z14, the systems have been designed along four guiding principles of security, availability, performance, and scalability. This is exactly why mainframes are entrenched in the industries where those principles are top level requirements — think banking, insurance, healthcare, transportation, government, and retail. You can’t go a single day without being impacted by a mainframe — whether that’s getting a paycheck, shopping in a store, going to the doctor, or taking a trip.

What is often a surprise to people is how massive open source is on mainframe. Ninety percent of mainframe customers leverage Linux on their mainframe, with broad support across all the top Linux distributions along with a growing number of community distributions. Key open source applications such as MongoDB, Hyperledger, Docker, and PostgreSQL thrive on the architecture and are actively used in production. And DevOps culture is strong on mainframe, with tools such as Chef, Kubernetes, and OpenStack used for managing mainframe infrastructure alongside cloud and distributed systems.

Learn more

You can learn more about open source and mainframe, both the history along with the current and future states of open source on mainframe, in our upcoming presentation. Join us May 15 at 1:00pm ET for a session led by Open Mainframe Project members Steven Dickens of IBM, Len Santalucia of Vicom Infinity, and Mike Riggs of The Supreme Court of Virginia.

In the meantime, check out our podcast series “I Am A Mainframer” on both iTunes and Stitcher to learn more about the people who work with mainframe and what they see the future of mainframe to be.

Calm technology

By 2020, 50 billion devices will be online. That projection was made by researchers at Cisco, and it was a key point in Amber Case’s Embedded Linux Conference keynote address, titled “Calm Technology: Design for the Next 50 Years,” which is now available for replay.

Case, an author and fellow at Harvard University’s Berkman Klein Center, referred to the “Dystopian Kitchen of the Future” as she discussed so-called smart devices that are invading our homes and lives, even though the way they are implemented is not always so smart. “Half of it is hackable,” she said. “I can imagine your teapot getting hacked and someone gets away with your password. All of this just increases the surface area for attack. I don’t know about you, but I don’t want to have to be a system administrator just to live in my own home.”

Support and Recede

Case also discussed the era of “interruptive technology.” “It’s not just that we are getting text messages and robotic notifications all the time, but we are dealing with bad battery life, disconnected networks and servers that go down,” she said. “How do we design technology for sub-optimal situations instead of the perfect situations that we design for in the lab?”

“What we need is calm technology,” she noted, “where the tech recedes into the background and supports us, amplifying our humanness. The only time a technology understands you the first time is in Star Trek or in films, where they can do 40 takes. Films have helped give us unrealistic expectations about how our technology understands us. We don’t even understand ourselves, not to mention the person standing next to us. How can technology understand us better than that?”

Case noted that the age of calm technology was referenced long ago at Xerox PARC, by early ubiquitous computing researchers, who paved the way for the Internet of Things (IoT). “What matters is not technology itself, but its relationship to us,” they wrote.

7 Axioms

She cited this quote from Xerox researcher Mark Weiser: “A good tool is an invisible tool. By invisible, we mean that the tool does not intrude on your consciousness; you focus on the task, not the tool.”

Case supplied some ordered axioms for developing calm technology:

  1. Technology shouldn’t require all of our attention, just some of it, and only when necessary.
  2. Technology should empower the periphery.
  3. Technology should inform and calm.
  4. Technology should amplify the best of technology and the best of humanity.
  5. Technology can communicate, but it doesn’t need to speak.
  6. Technology should consider social norms.
  7. The right amount of technology is the minimum amount to solve the problem.

In summing up, Case said that calm technology allows people to “accomplish the same goal with the least amount of mental cost.” In addition to her presentation at the Embedded Linux Conference, Case also maintains a website on calm technology, which offers related papers, exercises and more.


LinuxBoot Enables Server Setup and Boot with a Linux Kernel

The Linux Foundation is pleased to welcome LinuxBoot to our family of open source projects and to support the growth of the project community. LinuxBoot looks to improve system boot performance and reliability by replacing some firmware functionality with a Linux kernel and runtime.

Firmware has always had a simple purpose: to boot the OS. Achieving that has become much more difficult due to the increasing complexity of both hardware and deployments. Firmware must often set up many components in the system, interface with a wider variety of boot media, including high-speed storage and networking interfaces, and support advanced protocols and security features.

LinuxBoot replaces the often slow, error-prone, and obscured firmware code that executes these steps with a Linux kernel. The result is a system that boots in a fraction of the time of a typical system, and with greater reliability.

This matters in data centers providing cloud services. A data center might have tens of thousands of servers, and even a small failure rate adds up to expensive repairs. LinuxBoot enables organizations to improve operational aspects such as debugging and remediation, as well as functional aspects like powering machines on or off rapidly for elastic loads.

The speed and reliability of the boot process can also be a problem in consumer and industrial devices. For IoT, devices in the field may be tough to reach, and a boot failure can render a device useless for the customer and even cause safety issues in critical systems.

The LinuxBoot model brings key advantages for users across the broad spectrum of embedded, mobile, and server platforms. Leveraging the massive scale of development of Linux in the boot process gives the user control and support that can’t be achieved any other way.

The technique of using Linux to boot Linux has been common since the early 2000s in supercomputers, consumer electronics, military applications, and many other systems. The LinuxBoot initiative will further refine it so it can be more easily developed and deployed by a broader range of users, from individuals to data center-scale companies.

Organizations involved in LinuxBoot include Google, Facebook, Horizon Computing Solutions, and Two Sigma. The LinuxBoot community welcomes newcomers and invites people to get involved with the project at any level.

To learn more, visit https://www.linuxboot.org/.

“Nobody quite expected the strategists at Redmond to come forward with a direct and open ‘Microsoft Loves Linux’ statement, but they did and it was stated and it’s now officially ‘a thing’ that all the tech industry has become accustomed to.”

Read more at Forbes:

https://www.forbes.com/sites/adrianbridgwater/2018/01/15/linux-foundation-shares-some-love-back-for-microsoft-azure/#762f53382330

New online course will bring Azure pros up to speed with Linux, and vice versa

SAN FRANCISCO, January 11, 2018 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of a new training course, LFS205 – Administering Linux on Azure.

A large number of the virtual machines running in Azure use the Linux operating system, and this trend is likely to continue, so both Linux and Azure professionals should make sure they know how to manage Linux workloads in an Azure environment. LFS205 provides an introduction to managing Linux on Azure. Whether someone is a Linux professional who wants to learn more about working on Azure, or an Azure professional who needs to understand how to work with Linux in Azure, this course provides the requisite knowledge.

John Gossman, Distinguished Engineer, Microsoft Azure, and Linux Foundation Board Member commented: “With over 40 percent of VMs on Azure now Linux, we are working closely with The Linux Foundation on a Linux on Azure course to make sure customers currently using Linux on Azure–and those who want to–have the tools and knowledge they need to run their enterprise workloads on our cloud. We look forward to continued collaboration with The Linux Foundation to continue to deliver trainings to make customers’ lives easier.”

“As shown by The Linux Foundation and Dice’s Open Source Jobs Report, cloud computing skills are by far the most in demand by employers,” said Linux Foundation General Manager for Training & Certification, Clyde Seepersad. “This shouldn’t be a surprise to anyone, as the world today is run in the cloud. Azure is one of the most popular public clouds, and a huge portion of its instances run on Linux. That’s why we feel this new course is essential to give Azure professionals the Linux skills they need, give Linux professionals the Azure skills they need, and train new professionals to ensure industry has the talent it needs to meet the growing demand for Linux on Azure.”

The course starts with an introduction to Linux and Azure, after which students learn about advanced Linux features and how they are managed in an Azure environment. Next, the course covers managing containers, either in Linux or with the open source container technology that is integrated into Azure. After that, LFS205 covers how to deploy virtual machines in Azure, discussing different deployment scenarios. Once the VMs are available in Azure, students need to know how to manage them efficiently, which is covered next. The last part of the course teaches how to troubleshoot Linux in Azure and how to monitor it using different open source tools.

Students can expect to learn about:

  • Advanced Linux features and how they are managed in an Azure environment
  • Managing containers
  • Deploying virtual machines in Azure, and managing them
  • Monitoring and troubleshooting Linux in Azure
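As a small, hedged example of the kind of day-to-day task the course addresses, the sketch below checks a service on a Linux VM running in Azure over SSH from Python. The hostname, username and key path are placeholders, and the snippet is not taken from the course material.

```python
# Hypothetical sketch: check whether a service is healthy on an Azure Linux VM
# over SSH. The hostname, username and key path are illustrative placeholders.
import os

import paramiko

HOST = "my-vm.westeurope.cloudapp.azure.com"  # placeholder public DNS name
USER = "azureuser"
KEY_FILE = os.path.expanduser("~/.ssh/id_rsa")


def service_is_active(service: str) -> bool:
    """Run `systemctl is-active` on the VM and report whether the unit is up."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, key_filename=KEY_FILE)
    try:
        _, stdout, _ = client.exec_command(f"systemctl is-active {service}")
        return stdout.read().decode().strip() == "active"
    finally:
        client.close()


if __name__ == "__main__":
    print("nginx active:", service_is_active("nginx"))
```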

LFS205 is taught by Sander van Vugt, a Linux professional living in the Netherlands and working for customers around the globe. Sander is the author of many Linux-related video courses and books, as well as an instructor and course developer for The Linux Foundation. He is also a managing partner of ITGilde, a large co-operative in which about a hundred independent Linux professionals in the Netherlands have joined forces.

The course is available to begin immediately. The $299 course fee provides unlimited access to all course content and labs for one year. Interested individuals may enroll here.

About The Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage.

Linux is a registered trademark of Linus Torvalds.

# # #
