The explosion in information technology tools and capability has accelerated. Our information technology environment is moving at absolute light speed. Yet the back of data integration is broken: 90% of customers are using 20-year-old ETL technology designed to move batch files into the data warehouse at 1:00 a.m. It is a manual process. Worst of all, some vendors have even put this 20-year-old technology into their newly minted one-year-old clouds. Who wants 20-year-old technology in a one-year-old cloud?
To add insult to injury, data quality is … “another product.” I humbly submit that without resolving data quality issues and providing for their ongoing maintenance, you cannot integrate even two applications, load your data warehouse, or consider a future Virtual MDM deployment.
The cause of this is simple. Look at the revenue mix for the two major vendors that control the large-enterprise market. Their business is professional services, not software tools. They exist to sell you, their customer, as many consulting hours as possible at the highest rates imaginable. Legacy ETL technology is a bonanza for a company that bills your company by the hour.
Even worse, they don’t offer you the right mix of services and technology. You know the refrain: “The integration was done to specification, but it doesn’t seem to work. I guess we’ll just have to sell you additional consulting to do the data quality clean-up.”
The five best practices in data integration break this model, and Queplix turns it upside down. By using advanced data virtualization and pushing back against the legacy monopoly attached to your wallet, you can reduce your costs and speed your time to implementation.
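To make the contrast with batch ETL concrete, here is a minimal, purely illustrative sketch of the data virtualization idea: instead of copying records into a warehouse on a nightly schedule, a virtual layer answers each query against the live sources at request time. The in-memory SQLite databases below are hypothetical stand-ins for two source applications; this is not Queplix’s product or API, just the general pattern.

import sqlite3

# Two live "applications" -- hypothetical stand-ins for a CRM and an ERP.
# In a real deployment these would be remote systems of record.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp', 'ops@acme.example')")

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE accounts (acct_id INTEGER, acct_name TEXT, balance REAL)")
erp.execute("INSERT INTO accounts VALUES (1, 'Acme Corp', 12500.0)")

def virtual_customer_view(name):
    """Federate a lookup across both sources at query time.

    Nothing is copied into a warehouse beforehand; the "integration"
    is the live, on-demand join itself.
    """
    customer = crm.execute(
        "SELECT id, email FROM customers WHERE name = ?", (name,)
    ).fetchone()
    account = erp.execute(
        "SELECT balance FROM accounts WHERE acct_name = ?", (name,)
    ).fetchone()
    return {
        "name": name,
        "id": customer[0] if customer else None,
        "email": customer[1] if customer else None,
        "balance": account[0] if account else None,
    }

print(virtual_customer_view("Acme Corp"))
# -> {'name': 'Acme Corp', 'id': 1, 'email': 'ops@acme.example', 'balance': 12500.0}

The point of the sketch is architectural: the sources remain the systems of record, and data quality rules can be applied once, in the virtual layer, rather than re-cleaned after every nightly copy.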
Tags: Queplix, data integration, data management, data virtualization, data integration software, master data management, business intelligence, ETL, Informatica, IBM Cast Iron
Publish Date: September 3, 2011 11:17 PM