June 20, 2017
Roadblocks on the Path to Interoperability
Imagine a world where all the important data related to your health flowed easily between systems and straight to your health care provider. They could set automatic alerts if your heart rate was too high, if your blood pressure was above safe levels, or even if you'd had one too many drinks and shouldn't drive. Wouldn't that be great?
In healthcare, interoperability means that different systems and applications communicate and exchange data freely between parties—from hospital to pharmacy to patient.
Technology is already trending in that direction—iPhones and Fitbit trackers log activity levels and Sleep Number beds monitor your sleep habits—but some major roadblocks still exist on the path to full interoperability nirvana.
Here are a few sticking points:
Despite open standards like XML, APIs, and SDKs, vendors resist integration for fear of losing their status as the primary solution. Internal teams can also be unwilling to do the work necessary to rebuild their ecosystems. And business owners are wary of expensive structural overhauls. Simply put, change is scary for some people. "If it ain't broke, don't fix it."
There are many different metadata standards to choose from and each requires customization to be used effectively. Managing this metadata across platforms without standardization is tricky. Differences between hardware, servers (UNIX, Linux, Mac, Win), and infrastructure further complicate matters. For example, your designers might work on Macs while your field engineers use a different system entirely.
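One way to picture the metadata-standardization problem is as a translation layer: each platform names the same concepts differently, so someone has to maintain a crosswalk between schemas. Here is a minimal sketch of that idea; the source field names and the `vendor_a`/`vendor_b` schemas are purely illustrative, not taken from any real vendor.

```python
# A minimal sketch of cross-platform metadata normalization.
# The field names and vendor schemas below are hypothetical,
# chosen only to illustrate the crosswalk pattern.

# Mapping from two hypothetical source schemas to one internal schema.
FIELD_MAPS = {
    "vendor_a": {
        "dc:title": "title",
        "dc:creator": "author",
        "xmp:CreateDate": "created",
    },
    "vendor_b": {
        "Headline": "title",
        "By-line": "author",
        "DateCreated": "created",
    },
}

def normalize(record: dict, source: str) -> dict:
    """Translate a raw metadata record into the internal schema,
    keeping only fields the crosswalk knows about."""
    mapping = FIELD_MAPS[source]
    return {internal: record[external]
            for external, internal in mapping.items()
            if external in record}

print(normalize({"Headline": "Bridge at dusk", "By-line": "J. Doe"}, "vendor_b"))
```

The hard part in practice isn't the lookup table—it's agreeing on the internal schema and keeping every vendor-specific map current as formats evolve.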
Regardless of industry, when digital information is created, it ends up being used elsewhere for forecasting, creative use, mapping, etc. Do the people who build the system to support this process understand downstream usage? If you’re working on improving a process, has it been functioning in the arena long enough to reach maturity? Continual refinement of one process might slow the adoption of a new one. Trying to roll out a new process AND new tools AND integrate them all simultaneously is a recipe for disaster.
Now let's take a quick look at some challenges in new industries where digital asset management (DAM) is emerging:
Do you follow your own best practices or adhere to government standards? Are there overlapping interests between government and other bodies? Does this overlap affect control of your assets? In the digital age, what defines an asset, anyway? For example, is it a photo of a bridge…or the bridge itself?
New formats are emerging that make it difficult to standardize across media and downstream consumption models. HD video is more of a challenge to store and catalog than standard video. That's why the IPTC developed its Video Metadata Hub to define metadata fields by name and clearly describe their semantics and basic data types. Systems of origin that produce these artifacts (like HD cameras) are not accustomed to connectivity with downstream systems—or they live in closed vendor ecosystems that don't integrate with third parties. I predict that more vendors will adopt open, standard formats. Look at GoPro, which uses the H.264 codec in the .MP4 container. This openness has likely helped them own the market in this arena.
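The value of a standard like the Video Metadata Hub is that every field has an agreed name and meaning, so completeness can be checked automatically. The sketch below is loosely modeled on that idea; the property names and the "required" set are illustrative assumptions, not the official IPTC schema, which you should consult directly.

```python
# Illustrative clip record loosely inspired by IPTC Video Metadata Hub
# naming; the exact property names and the REQUIRED set here are
# assumptions for demonstration, not the published specification.
clip = {
    "Title": "Harvest drone pass 07",
    "Date Created": "2017-06-20T09:30:00Z",
    "Frame Rate": 29.97,
    "Video Codec": "H.264",
}

REQUIRED = {"Title", "Date Created"}

def missing_fields(record: dict) -> set:
    """Return required properties absent from a record."""
    return REQUIRED - record.keys()

print(missing_fields({"Title": "Untitled clip"}))
```

Once producers and consumers agree on field names, this kind of validation can run at ingest time instead of being a manual cataloging chore.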
Today's farmer/commodity producer is much more technologically savvy than in previous generations. Old tractors were a motor and a plough; today they capture more data than the first spaceships. Geospatial data is a player here too, but standards for how to manage this info are still being developed. Currently, taking the information from crop sensors and plugging it into third-party systems for optimization, supply chain management, forecasting, and pricing is largely a manual process. Without regulatory or governmental intervention to force the issue, changes here will most likely be slow and incremental.
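The manual step described above usually means reshaping one vendor's sensor export into whatever structure a third-party system expects. Here is a small sketch of that reshaping, assuming a hypothetical CSV export format and an equally hypothetical JSON payload layout—real equipment vendors each use their own column names and units, which is exactly why this is still hand work today.

```python
import csv
import io
import json

# Hypothetical sensor export; column names and units are assumptions
# for illustration, not any particular vendor's format.
raw = """field_id,moisture_pct,soil_temp_c,lat,lon
NW-12,31.4,18.2,41.58,-93.62
NW-13,28.9,18.5,41.59,-93.61
"""

def rows_to_payload(text: str) -> list:
    """Reshape raw sensor rows into a generic JSON-friendly structure
    that a downstream forecasting or pricing system could ingest."""
    out = []
    for row in csv.DictReader(io.StringIO(text)):
        out.append({
            "field": row["field_id"],
            "readings": {
                "moisture": float(row["moisture_pct"]),
                "soil_temp": float(row["soil_temp_c"]),
            },
            "location": [float(row["lat"]), float(row["lon"])],
        })
    return out

print(json.dumps(rows_to_payload(raw), indent=2))
```

With agreed standards, this transformation would be a published mapping rather than a bespoke script per vendor.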
More new formats and asset-creation devices here, too. Most real estate agents probably don't have an established practice for tagging and cataloging photographs. Commercial real estate uses files as varied as architectural CAD drawings, rich media and graphics files, After Effects projects, and other Adobe formats.
Processes in manufacturing are best understood by their maturity in models like Kanban, ISO, and Six Sigma. These processes focus on quality control without much consideration for next steps. Systems are already in place to manage downtime, but there's a lot more data to be leveraged. Applying tracking mechanisms during the manufacturing process may be able to assist with inventory control and supply chain challenges. The next generation of leaders in manufacturing will rely on data and systemic improvements, but change will come slowly.
In summary, interoperability is our future, but older industries might be slower to adapt. I’ll leave you with some recent quotes about interoperability from professionals in the field:
“Interoperability is a continuum. As we expand across the spectrum of data and information exchange, there are more places where we can make this process even more efficient.” -Steve Schliesman, Department of Veterans Affairs
“We are working to unlock healthcare data and information so that providers are better informed and patients can access their healthcare information, making them empowered, active participants in their own care.” -Sylvia Burwell, Department of Health and Human Services
“We’re deadly serious about interoperability. We’ll begin initiatives in collaboration with physicians and consumers toward pointing technology to fill critical use cases like closing referral loops and engaging a patient in her care.” -Andrew Slavitt, Centers for Medicare and Medicaid Services
Read more articles written by our DAM expert, Toby Martin! Take a look at one of his latest posts, Learn the Difference Between Big Data & Metadata, or take your digital asset management knowledge to the next level by downloading our DAM Best Practices Guide. The guide is packed full of tips on streamlining workflow, metadata best practices, and more.