April, 2017

In our third and final post on “why not use PDM for all CM requirements” we will discuss the three most common strategies over the years for providing configuration management capabilities to users across the OEM enterprise, as well as throughout the supply and service chain. These approaches are to:

1) Repurpose, extend, and customize Engineering PDM software to provide the CM functionality desired for users in or out of engineering.

2) Invest in, implement, and embrace Enterprise PLM solutions and product data IT architectures that deliver CM capabilities from one or more PLM components.

3) Use rapidly deployable, industry-specific, best-in-class CM tools and applications that work in a federated PLM platform with whatever CAD, PDM, or PLM solutions may be used now or in the future, which will inevitably change.

While each of these approaches to providing CM was valid and preferred at one time, together they can resemble an archaeological dig when viewed with fresh eyes. To better understand the context and differences of these strategy options, it’s helpful to look at a very brief history of PDM and PLM.

PDM in the 1980s – 1990s

Some of us can no doubt recall developing, implementing, or using the first generation of PDM tools that evolved in the 80s. These standalone PDM software products provided basic CAD data management, file vaulting and security, BOM management, classification and part management, document and version control, and other functionality narrowly focused on the needs of the CAD user and design engineer.

Over the span of the next decade PDM capabilities rapidly expanded to include workflow management, project management, change and configuration management, enhanced visualization, and collaboration and interoperability, among other functions such as manufacturing support. PDM solutions, however, remained for the most part CAD-centric and rarely supported the multi-CAD environments found in the supply and service chain equally well. The principal user driving requirements remained the CAD designer or engineer. Not surprisingly, most of the benefits and value of PDM accrued to product development and engineering, who thus owned the budget and authority to set priorities.

Early generation PDM solutions often had many modules that could be bolted on at additional cost. Some were plug-and-play, while others required significant customization and integration. These additional modules included rudimentary CM capabilities that required substantial programming to support the standards of particular industries or the requirements of specific users.

Since PDM was largely an engineering initiative, the support of CM requirements and users upstream in the supply chain or downstream within service partners was rarely a top priority. The result was that PDM software providers listened to their engineering customers, and their customers did not always ask for the functionality and usability needed by CM practitioners outside of product development. This explains why most PDM products even today do not provide the deep CM functionality referenced in our March CMsights post.

PLM in the 2000s

During this next period PDM solutions became so expansive in capabilities – and expensive in investment – that the term PLM evolved as a better description. Managing, integrating, and synchronizing the multiple software products, data sets, and workflow processes in an alphabet soup of applications – including CAD, CAE, CAM, PDM, CM, EDA, EDM, PM, Viz, and DMfg, to name a few – became standard features of “enterprise” PLM.

Monolithic enterprise PLM solution providers emerged with proprietary stacks of tightly integrated applications using sophisticated multi-tiered architectures. Mergers and acquisitions were rampant as large providers swallowed up independent software vendors, along with new technologies, to complete their application portfolio to “own” a customer. The resulting complexity of enterprise PLM solutions was illustrated by vendor price books and customer proposals detailing all the software modules, licensing permutations, and deployment scenarios that could run hundreds of pages in length and were only comprehensible to a few product gurus.

On the very positive side, product data authors and consumers were no longer limited to engineering, but were now found throughout the enterprise and supply chain over the lifecycle of products and systems. Whether users knew it or not, they were being migrated from an engineering-centric PDM schema to an enterprise PLM strategy as part of a much bolder, albeit riskier, product data architecture. Enterprise-wide PLM strategies often took years to specify requirements, evaluate solutions, define roadmaps, justify capital budget, deploy, integrate, migrate, train, maintain, and upgrade. The cost to perform all this was often many thousands of man-hours and millions of dollars, to the delight of management consultancies and system integrators.

As part of an enterprise PLM vision, CM capabilities were often featured prominently as the central justification for the overall investment, especially since the benefits and risks from product configuration management done well or poorly were distributed across the entire enterprise and lifecycle. Yet, CM was often not implemented until later phases of the PLM roadmap. All of the baseline prerequisites for an invasive product data backbone served by a massive central repository still had to be implemented before CM functions could be tackled. While a future with PLM was being sold, it was like going back to the 80s using PDM to get started.

With the scale of deployment complexity now at an enterprise level instead of departmental, it is no surprise that executing an enterprise strategy for PLM could consume years. It was not uncommon that by the time CM workflows were functional, user requirements, product lines, management priorities, funding sources, and solution partners had all changed, often drastically. Many promising big-bang deployments on paper became disappointing long whimpers in execution. Users of configuration data outside of engineering – in logistics, procurement, quality, test, and field service – were often left to make do with what they had already provisioned or fabricated on their own.

Product Innovation Platforms in the 2010s

An inconvenient truth is that the PLM industry, which promoted itself by selling change and transforming enterprises, did not always practice what it preached by offering robust solutions and resilient architectures that could easily evolve as user requirements changed, technologies advanced, application software came and went, and deployment options multiplied, such as off-premise SaaS. Few enterprise PLM customers, all the smarter and wiser after investing in previous generations of PDM and PLM, want to go through yet another cycle of rip, tear, and abandonment unless they can be assured the next strategy is far more survivable and sustainable.

More recently the term Product Innovation Platform (PIP) has emerged to describe a different vision for how the dozens of technical applications that enable a PLM strategy, including PDM and CM, can be delivered in a more robust and resilient manner that is inherently affordable in the near term and sustainable over the long term. This strategy typically relies on a federated architecture: a distributed, interoperating portfolio of best-in-class applications, often composed of rapidly deployable, industry-configured software solutions.

In this view of PLM, the utopian fantasy of having one benevolent enterprise solution provider that satisfies all PLM requirements is finally surrendered, and buried as it should be. Instead, the focus is on building an adaptable, nimble, vendor-agnostic architecture that can elastically accommodate ever-changing user requirements, the relentless pace of innovation, disruptive game-changing technologies, and the eventual appification of PLM.

The Preferred Strategy for Delivering CM into the Future

In a vendor-agnostic PIP strategy for providing CM, users are freed from the myth that a single PLM solution suite or one behemoth solution provider can satisfy all requirements. Industrial customers are able to select and deploy a portfolio of more affordable, right-sized, best-in-class, industry-focused applications that interoperate in a federated ecosystem less dependent on an omnipresent master. The systemic expense, complexity, and fragility of an enterprise PLM approach are eliminated. The emphasis shifts from a philosophy of owning and controlling product data to sharing and harmonizing it for more open collaboration across the OEM as well as the supply and service chains.

When CMstat is asked how best to examine CM solutions when viewed as part of a new product information and innovation platform strategy, we typically recommend that the following ten “-ility” characteristics be carefully evaluated: functionality, usability, deployability, adaptability, extensibility, interoperability, security, scalability, affordability, and sustainability. While evaluating each of these attributes still requires diligence, the process can now be undertaken without being incapacitated by legacy thinking from the 1990s on engineering PDM or from the 2000s on enterprise PLM.
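One way to make such an evaluation concrete is to treat the ten “-ility” attributes as a weighted scoring matrix. The following is only an illustrative sketch in Python: the attribute list comes from this post, but the weights, rating scale, and candidate scores are hypothetical assumptions, not a CMstat methodology.

```python
# Sketch of a weighted scoring matrix for the ten "-ility" attributes.
# The attribute names come from the post; all weights and ratings below
# are purely illustrative assumptions.

ILITIES = [
    "functionality", "usability", "deployability", "adaptability",
    "extensibility", "interoperability", "security", "scalability",
    "affordability", "sustainability",
]

def score_candidate(weights, ratings):
    """Return the weighted total for one candidate CM solution.

    weights -- dict mapping each "-ility" to its relative importance
    ratings -- dict mapping each "-ility" to a 1-5 evaluation score
    """
    missing = [a for a in ILITIES if a not in ratings]
    if missing:
        raise ValueError(f"unrated attributes: {missing}")
    return sum(weights.get(a, 1.0) * ratings[a] for a in ILITIES)

# Hypothetical weighting: emphasize interoperability and affordability,
# as a federated PIP strategy would suggest.
weights = {a: 1.0 for a in ILITIES}
weights.update({"interoperability": 2.0, "affordability": 1.5})

# Hypothetical candidate: neutral baseline with two stronger attributes.
candidate = {a: 3 for a in ILITIES}
candidate.update({"interoperability": 5, "deployability": 4})

total = score_candidate(weights, candidate)
```

Comparing the weighted totals of several candidates side by side keeps the evaluation focused on the attributes themselves rather than on any one vendor’s framing of them.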

In a future CMsights post we will examine the characteristics of CM applications to support a product innovation platform strategy that works not just for innovation in product design but also enables innovation in operating efficiencies of suppliers and service chain partners.

Until then learn more by downloading CMstat’s whitepaper “Nimble Configuration Management for the Contract Supply and Service Chain.”

Or CLICK HERE to see how CMstat’s product PDMplus offers a rapidly deployable, instantly usable, and immediately affordable CM solution.