"Both the move to BPM and the move to SOA have brought to light discrepancies between the way that the same data is defined in different applications," said Robin Bloor, a partner with Newton, Mass.-based Hurwitz and Associates LLC and founder of Bloor Research. "This has undoubtedly sparked an interest in data modeling."
The larger trends of SOA and BPM aren't going away and in fact are only going to become more prominent, according to Bloor. They both require the reconciliation of data definitions between systems as well as investment in master data management.
There is also the fact that there are now initiatives in many industry sectors to harmonize on a particular "common data model" for data interchange between companies, Bloor explained.
"This is a trend that will increase over time rather than diminish," he said.
"An effective data model is business-focused and contains business content," the report says. "It needs to be flexible so that it can be modified to reflect new business approaches and needs, and it must anticipate the impact of change."
Ease of access and avoiding data redundancy in a transactional system are also key aspects of a good data model.
"These two drivers in many cases work against each other, and so it becomes the modeler's challenge to leverage one against the other," said Ben Ettlinger, lead data modeler in the IT division of the New York Power Authority. "You can't end up with a physical database that is too difficult for programmers to access, but you don't want the price of your widget in three places which are out of sync, with different prices and different measures."
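Ettlinger's out-of-sync widget price can be sketched in miniature. The following is a hypothetical illustration (table and column names invented here) using Python's built-in sqlite3: a denormalized design copies the price into every order row and lets copies drift apart, while a normalized design keeps the price in one place that every order references.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: the widget price is copied into every order row.
cur.execute("CREATE TABLE orders (id INTEGER, product TEXT, price REAL)")
cur.execute("INSERT INTO orders VALUES (1, 'widget', 9.99)")
cur.execute("INSERT INTO orders VALUES (2, 'widget', 9.99)")

# A price update that misses one row leaves the copies out of sync.
cur.execute("UPDATE orders SET price = 10.99 WHERE id = 1")
prices = sorted(row[0] for row in cur.execute(
    "SELECT DISTINCT price FROM orders WHERE product = 'widget'"))
print(prices)  # two conflicting prices: [9.99, 10.99]

# Normalized: the price lives in one place and orders reference it.
cur.execute("CREATE TABLE products (name TEXT PRIMARY KEY, price REAL)")
cur.execute("CREATE TABLE orders2 (id INTEGER, "
            "product TEXT REFERENCES products(name))")
cur.execute("INSERT INTO products VALUES ('widget', 9.99)")
cur.execute("INSERT INTO orders2 VALUES (1, 'widget')")
cur.execute("INSERT INTO orders2 VALUES (2, 'widget')")
cur.execute("UPDATE products SET price = 10.99 WHERE name = 'widget'")

# Every order now sees the single, consistent price via a join.
joined = cur.execute(
    "SELECT DISTINCT p.price FROM orders2 o "
    "JOIN products p ON o.product = p.name").fetchall()
print(joined)  # one price everywhere: [(10.99,)]
```

The trade-off Ettlinger describes is visible in the second half: consistency comes at the cost of a join, which is exactly the access burden the modeler must weigh against redundancy.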
Discrepancies in the way data is defined may not seem like a big deal, but Ettlinger recalled one example that proved otherwise.
"My favorite example is when one of NASA's early Mars landers crashed because one development team applied inches to a number and another team used the same number with centimeters," he said.
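One common defense against exactly this class of bug is to model units explicitly in the data rather than passing bare numbers between teams. A minimal sketch (the `Length` class and conversion table here are invented for illustration, not from any particular library):

```python
# Conversion factors between supported length units.
CONVERSIONS = {("in", "cm"): 2.54, ("cm", "in"): 1 / 2.54}

class Length:
    """A number tagged with its unit, so mixed units convert explicitly."""

    def __init__(self, value, unit):
        self.value, self.unit = value, unit

    def to(self, unit):
        if unit == self.unit:
            return self
        return Length(self.value * CONVERSIONS[(self.unit, unit)], unit)

    def __add__(self, other):
        # Convert the other operand instead of silently adding raw numbers.
        return Length(self.value + other.to(self.unit).value, self.unit)

# One team records 10 inches; another records 10 centimeters.
a = Length(10, "in")
b = Length(10, "cm")
total = a + b
print(round(total.value, 3), total.unit)  # 13.937 in
```

Adding the raw numbers would have produced 20 of nothing in particular; carrying the unit alongside the value forces the conversion that the mismatched teams in Ettlinger's example skipped.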
Besides this catastrophic case, other common data discrepancy problems include skyrocketing maintenance costs, merger and acquisition problems, and IT being less able to respond to business needs, according to Dan Linstedt, co-founder and CIO of Golden, Colo.-based Genesee Academy, a data modeling training center, and inventor of the Data Vault data modeling method.
"The more adaptations you make to an architecture that was not meant to be that way, the harder it is to maintain, the higher the cost, the longer it takes," Linstedt said. "Eventually these things just fail in the business users' eyes."
Tools to help
The role of a data modeling tool is to help design either a database schema or a UML (Unified Modeling Language) model, Bloor explained, dividing the tools into two fundamental camps.
"You can have other design targets, such as XML or a repository or simply a metadata model," Bloor said. "Also, some modeling tools target specific databases, such as Oracle Designer, which targets Oracle. What this means is that you cannot generalize about such tools."
There are more than 40 data modeling tools out there, according to Ettlinger, citing Stamford, Conn.-based Gartner Inc. Some of the more popular providers include CA, Embarcadero Technologies, Sybase, Oracle and Popkin.
"The market is mature, but I can really only speak for CA's ERwin [Data Modeler] which is the tool I use," said Ettlinger, who is also the president of the New York ERwin User Group. "It is an excellent tool, with a very responsive and cooperative vendor development lab."
Even though most of the current crop of data modeling tools are more than 10 years old, they are generally effective and have satisfied users, according to Bloor, who also singled out ERwin as an excellent option for designing database schemas.
"The tools do what they have to do and that's help design databases," Ettlinger concurred. "Sometimes people may be unhappy with individual features of the tools. In almost 15 years using ERwin, we have rarely had any application issues because of the databases we built."
Of course, current tools do have their limitations.
"If you wanted a modeling tool that did something difficult, like gather all the database schemas and data files used within the organization and help you transform them into a common corporate data model -- I know of nothing that does that," Bloor said.
Current models and warehouses also handle temporal data poorly, according to Linstedt, who doesn't expect any revolutionary data modeling tools to reach the pipeline for another two to five years.
However, effective data modeling takes more than tools, according to Linstedt. It starts with the idea that the model needs to serve IT -- in terms of agility, scalability and flexibility -- and be an enabler for IT to serve the business.
"A lot of people forget that a data modeling architecture can make or break your entire business effort," Linstedt said. "And the model is absolutely critical because it's a reflection of how you run your business and how you capture and use your information assets."