Accurate, high-quality data in organizational work processes has become a success factor for the modern organization. The rapid growth of data volumes entering the enterprise creates new roles, and records managers must be more diligent in evaluating and distinguishing the more efficient practices. According to Howard (2010), a recent global survey by Bloor Research found that over 80% of respondents, from both IT and business backgrounds, firmly agreed that accurate enterprise information is crucial and a primary concern in organizational policy and practice (p. 1).
In fact, almost every job flow in an organization needs to incorporate effective data management, or what is called good data quality. Mullins (2002) supports this: facing the continual challenge of improving business intelligence, many enterprises have decided to integrate their data both to analyze required data effectively and for preservation purposes (p. 500). Data quality underlies every business decision, customer relationship, and business investment, where the available data must be consistent, accurate, and truly reliable, especially under peak circumstances.
To achieve good data quality, organizations need to develop constructive data integration, which serves as a value-added instrument, scheme, and standard for clustering and aligning data assets in support of business goals. In other words, data integration can map out effective strategies and can cooperate and integrate with external infrastructure and federated or virtual collections so as to generate good data quality and achieve organizational objectives. This parallels today's business landscape, which demands that enterprises fundamentally reconsider their integration strategies in light of faster processors, cloud computing, and database innovations, in order to strengthen the company's data structures and architectures.
In practice, globalized business has pushed enterprises to manage their collected data intelligently through Master Data Management (MDM). MDM can deliver information efficiently in the sense of real-time processing, an active hub, synchronization capability, and correct dissemination of quality information. This parallels what Andreescu and Mircea (2008) have discussed: by implementing MDM within data integration strategies, specifically a Service-Oriented Architecture (SOA), both internally and externally, data can be arranged more consistently, information assets can be aligned analytically, information can be distributed accurately, and the potential benefits of the SOA plan can be realized. In other words, today's enterprises concentrate their data integration practice on software, middleware, and management tools capable of connecting different parts of an application or a series of applications.
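The MDM idea described above can be made concrete with a small sketch. The Python example below is only an illustration under stated assumptions: the record layouts, field names, and the "most recent non-empty value wins" survivorship rule are invented for this sketch and do not come from any cited MDM product. It merges duplicate customer records from two source systems into a single consolidated ("golden") record.

```python
from datetime import date

# Two source systems hold overlapping customer records (layouts are illustrative).
crm_record = {"id": "C-101", "name": "Jane Doe", "email": "jane@example.com",
              "phone": "", "updated": date(2010, 3, 1)}
billing_record = {"id": "C-101", "name": "J. Doe", "email": "",
                  "phone": "+1-555-0100", "updated": date(2010, 6, 15)}

def consolidate(records):
    """Build a golden record: for each field, keep the non-empty value
    from the most recently updated source (a simple survivorship rule)."""
    ordered = sorted(records, key=lambda r: r["updated"])  # oldest first
    golden = {}
    for rec in ordered:
        for field, value in rec.items():
            if field == "updated":
                continue
            if value:  # later, non-empty values overwrite earlier ones
                golden[field] = value
    return golden

golden = consolidate([crm_record, billing_record])
# email survives from CRM, phone from billing, name from the newer billing record
```

A real MDM hub would add matching (deciding that two records describe the same customer), lineage, and synchronization back to the sources; the survivorship step shown here is only one piece of that pipeline.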
In this context, DBMSs have entered the enterprise as the solution medium for the vast changes under way. In the coming years, businesses will need to be more comprehensive in order to cope with, and stay in line with, competitive global demands. The enterprise database must function effectively as a point of data integration, flexibly supporting data structures and infrastructure that change rapidly with time and technological advancement. In addition, the selected DBMSs must be capable and competent in integrating with both heterogeneous and homogeneous systems available in the market. This is very important, indeed a necessity, because enterprises need to communicate, in the sense of data transactions, not only within the organization but also externally.
Data quality and integration: ideas and definitions
There are many explanations of the data quality and integration concept. Along its dimensions, enterprises unanimously seek accuracy, suitability, completeness, currency, and relevance, as these are regularly labelled and determined by professionals. Pipino, Lee, and Wang (2002) studied data quality assessment and argued that it is effectively obligatory, its objective being accuracy in the acquired data. From the study, the authors concluded that a company or enterprise should employ both subjective and objective key elements in its quest for data quality metrics (pp. 211-218).
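Objective assessments of the kind Pipino, Lee, and Wang describe are often expressed as simple ratios of desired outcomes to total outcomes. The following is a minimal sketch of that form in Python; the sample records and field names are invented for illustration. It scores completeness as the fraction of records with a non-missing value.

```python
def completeness(records, field):
    """Simple-ratio metric: share of records with a non-missing value
    for the given field (1.0 means fully complete)."""
    present = sum(1 for r in records if r.get(field) not in (None, ""))
    return present / len(records)

# Illustrative customer records with some missing values.
customers = [
    {"name": "A. Smith", "email": "a@example.com"},
    {"name": "B. Jones", "email": ""},            # missing email
    {"name": "C. Brown", "email": "c@example.com"},
    {"name": "", "email": "d@example.com"},       # missing name
]

print(completeness(customers, "email"))  # 0.75
print(completeness(customers, "name"))   # 0.75
```

The same ratio pattern extends to other objective dimensions, for example the share of values passing a validity rule, while subjective assessment would instead survey the data's users.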
The following previous literature frames the broad understanding of the data quality and integration concept. Trant (1993) argued that the concept is concerned with several key factors, for instance accuracy, data requirements, and the analysis of relevance metrics; he also discussed how the accuracy of reliable data results from efficient preservation and interpretation of statistical metadata [1]. Referring to DHS methodological reports, Pulum (2008) defined three main principles as indicators of data quality criteria (p. xvii). And according to Seljak and Zaletel (n.d.), the concept concerns how to achieve the enterprise's objectives and goals: the company should deliver data quality in a timely, instant, and accurate manner, and most importantly fulfil the quality standards (p. 1).
The demand for, and practice of, good data quality has existed since at least the mid-1990s. At that early stage, however, there was little awareness of effective data quality implementation, even though most organizations were already practicing it indirectly. It has since developed globally over a considerable time, and the value placed on data quality has grown and been reshaped within enterprises. As with the data concept itself, definitions of data quality and integration have been expressed in many ways, but the motive is clearly to acquire trustworthy data that supports the business.
Data is a term that represents and carries meaning for groups of information, while data quality describes data that has been chosen systematically according to the specific requirements of a particular enterprise. Data counts as high quality if it fits and fulfils customer requirements, enterprise operations, top management decision-making, and project execution.
The aim is to ensure the reliability and effectiveness of data before any data is retrieved from the data warehouse for business circumstances or job processes. Typically, data quality concerns synchronizing, retrieving, updating, analyzing, and standardizing data, whether within a single application or across various dissimilar systems. Consequently, the Census Bureau defined quality as what is "most appropriate for use", identified and standardized against the customers' requirements (Tupek, 2006, p. 3).
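Standardizing data that arrives from dissimilar systems often means mapping each source's formats into one canonical form before integration. The sketch below is a Python illustration under assumptions: the set of date layouts and the choice of ISO 8601 as the canonical form are invented for the example, not drawn from any cited system.

```python
from datetime import datetime

# Each source system may emit dates in its own layout (assumed formats).
KNOWN_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y", "%d %b %Y"]

def standardize_date(raw):
    """Try each known source format and return the ISO 8601 form,
    or None if the value matches no known layout (flag for review)."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None

print(standardize_date("15/06/2010"))   # '2010-06-15'
print(standardize_date("06-15-2010"))   # '2010-06-15'
print(standardize_date("not a date"))   # None
```

Values that match no known layout are returned as None rather than guessed at, so they can be routed to review instead of silently corrupting the warehouse.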
Data integration is part of the data quality process; in particular, it is the application layer reflected in the attributes of a data warehouse. Data integration mainly supports the modern organization's information system: the integration must provide a stable platform for data management and grant a high level of quality in data and information services. For this reason, data quality and integration cannot be separated; they are bound by a strong relationship of dependence. This has been clearly visible since the 1980s and 1990s, when information systems, and database systems in particular, developed rapidly across various backgrounds and subject areas. Researchers in the field have developed more comprehensive query languages and richer data models, with the emphasis on expanding and increasing trustworthy data quality in support of enterprise workflows and decision-making (Ramakrishnan, 2000).
The importance of DBMSs in producing high quality is clear when only a small minority now uses anything other than computers to collect, deliver, and preserve data. Computers have proven to be a comprehensive and dynamic medium, but a problem arises when data is entered incorrectly into storage or the data warehouse. Once the enterprise has identified the errors, fixing them can cost enormous sums, because the enterprise must alter its data processes to fit linguistic and cultural norms rather than the reverse. Ideally, all entered data should match the implied patterns and norms, and all data should be addressed and presented accordingly. This costs more time and money at the design stage, but saves much more in data cleansing after capture and, importantly, allows increased data quality.
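The point that validation at the design and capture stage is cheaper than cleansing afterwards can be made concrete with a small input-validation sketch. This Python example is illustrative only: the field rules and regular expressions are assumptions for the sketch, not any standard. Records that fail the rules are rejected at entry instead of polluting the warehouse.

```python
import re

# Illustrative capture-time rules; real rules would come from the data model.
RULES = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "postcode": re.compile(r"^\d{5}$"),
}

def validate(record):
    """Return the list of field names that violate the capture rules.
    An empty list means the record may enter the warehouse."""
    errors = []
    for field, pattern in RULES.items():
        value = record.get(field, "")
        if not pattern.match(value):
            errors.append(field)
    return errors

good = {"email": "jane@example.com", "postcode": "90210"}
bad = {"email": "jane@example", "postcode": "ABC12"}

print(validate(good))  # []
print(validate(bad))   # ['email', 'postcode']
```

Rejecting the bad record at capture time costs one round trip with the data supplier; finding it downstream means detective work across every report that consumed it.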
Effects and consequences of poor data quality and integration
The advent of new, advanced technologies has had a major impact on global business, as organizations, regardless of size, now have access to a worldwide customer base. As technology grows at a rapid pace, data and its integration methods must be managed so that every process in the enterprise is led by high-quality data. Weak management of data quality has an immediate impact on organizational performance. Citing a study by the Data Warehousing Institute, Agostino put the annual loss due to poor data at approximately $600 billion (as cited in Li, Bheemavaram, and Zhang, 2006, p. 40). This loss is the adverse consequence of poor data quality and integration, which leaves the modern organization less proactive, leads to inappropriate decision-making and failure to satisfy customers, increases operational cost, and reduces the ability to execute project strategy. Davenport (n.d.) likewise identified poor data quality as an essential cause of, and forceful contributor to, an ineffective "information ecology" in the information age.
Overcoming issues of poor data quality is like a race for the enterprise. Organizations normally need to collect and manage global data efficiently, in the sense that they must strategically discover effective integration solutions to handle the collected data. In doing so, the enterprise meets various unavoidable challenges it must overcome; otherwise the data will not be easy to organize and will not be cost-effective. From an international perspective, the consequences of poor international data quality are like those of poor quality internally. The enterprise will struggle to use the data to produce strategic campaign lists, resulting in weak and untimely response rates; it will be unable to synchronize and integrate multiple data sources; and it will be unable to furnish users with appropriate reports or draw accurate conclusions from the available data. This discussion gives its most specific examples around customer care, as that is among the components most affected by poor data quality, with problems reaching down to each individual customer. Across national boundaries, poor data quality and integration likewise prevent the synchronization needed to draw comparison lists across multiple enterprises. In the end, all of these implications are borne by the enterprise.
Impact on enterprise operations
The enterprise feels the effects of poor data quality and integration in many ways. At the operational level, misleading data distribution and use create dissatisfaction among customers, internally and externally, directly and indirectly. This echoes Churcher (2008): every enterprise must preserve its business data; corporate organizations spend billions to look after their customer, wage, and operations data, and the price of getting it wrong is brutal, since businesses may collapse and stakeholders and customers lose capital. For this reason, enterprises are willing to invest in, install, design, and maintain large databases or information systems as the preferred solution, because they can see the implications of poor data quality and integration for business operations (p. xxi).
Returning to customer care: customers have the right to have their name and address stored correctly, to receive the products and services they require safely and on time, to have their private information properly kept, and, most importantly, to be billed correctly. When things go wrong, the situation triggers dissatisfaction and wasted time, and overall leads to increased customer loss. Many customers simply expect the details associated with their order to be correct, and most are intolerant of data errors. Poor data quality without good supporting integration applications will certainly increase operational cost, since time and other resources must be spent detecting and correcting errors and retaining customers. These effects ripple through the organization's whole working chain of operations, which no stakeholder would recommend.
From the customer's perspective, supporting data must possess three attributes that fulfil customer requirements. The first is utility, which refers to the effectiveness of the information for users' purposes. The second is dependability with respect to the objective: the analyzed data must respond to the need, which refers to the trustworthiness of the data and information; specifically, the data must always be precise, reliable, and unbiased. The last is data integrity before the data is used and manipulated, which refers to the guarantee that the information is secured against unauthorized access and any information leakage.
Once these rules are followed, the likelihood of suffering the impacts and consequences of poor data quality should decrease markedly. As a further precaution or safeguard, the Census Bureau has encouraged modern organizations to apply six dimensions of data quality, so as to work within limited resources while still achieving accuracy. The criteria are relevance, accuracy, timeliness, accessibility, interpretability, and transparency, explained as follows:
Relevance - refers to the degree to which available data products provide information that meets the customers' needs.
Accuracy - refers to the difference between an estimate and its true value in the field. It is normally characterized in two distinct terms: random error (variance) and systematic error (bias).
Timeliness - refers to the span of time between the reference period of the information and when the enterprise delivers the data to users.
Accessibility - refers to the seamless access with which authorized users can retrieve any kind of related information.
Interpretability - refers to the availability of supporting documentation that helps customers better understand and use particular products.
Transparency - refers to providing documentation about the assumptions, techniques, and limitations of a data product, so that qualified third parties can reproduce the information, unless prevented by confidentiality or other legal limitations.
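The accuracy dimension's split into random error (variance) and systematic error (bias) can be illustrated numerically. The short Python sketch below uses invented measurement values; it compares repeated estimates against a known true value and reports the two error components separately.

```python
def bias_and_variance(estimates, true_value):
    """Systematic error (bias): how far the mean estimate sits from the truth.
    Random error (variance): how much estimates scatter around their own mean."""
    n = len(estimates)
    mean = sum(estimates) / n
    bias = mean - true_value
    variance = sum((x - mean) ** 2 for x in estimates) / n
    return bias, variance

# Five repeated estimates of a quantity whose true value is 100.
estimates = [102.0, 101.0, 103.0, 102.0, 102.0]
bias, variance = bias_and_variance(estimates, 100.0)
print(bias)      # 2.0 -> consistent overestimation (systematic error)
print(variance)  # 0.4 -> small scatter (random error)
```

A process can thus be precise (low variance) yet inaccurate (high bias), or the reverse; treating the two separately is what makes the distinction in the accuracy criterion useful.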
Thus, poor data quality and integration is a serious matter, and the related issues need to be addressed closely. Its implications and consequences are harmful in many ways: besides increasing enterprise operating costs, poor data quality destabilizes business functions over the medium and long run.
Conclusion
Modern organizations need to understand closely and keep pace with advancing technologies, while staying abreast of the solutions currently available. The purpose is to achieve high quality in the data provided and to integrate effectively with available systems. To produce good data quality, enterprises practice data service integration and exchange through SOA, expressed in a shared language of data content and data structure and supported by MDM. Its function is to align master data across various systems, departments, collaborators, processes, policies, and procedures, keeping the available data consistent as new information is entered and updated.
Overall, SOA functions by connecting people ("liveware"), data processes, and high-quality information through integrated systems, while also providing a viable platform on which to develop new functionality.
Focusing on data quality, it ties enterprise business processes to either achievement or failure in the long run. The impacts and consequences of poor data quality have been discussed, particularly for business operations and customer dissatisfaction. Many schemes and frameworks offer guidance for ensuring data quality, but the important thing is how the enterprise understands and maintains data accuracy, both internally and externally.