Evaluation of Information Quality From the User’s Perspective


Authors: Klauss Nogueira; Rafael Albuquerque da Silva; and Tiago Menegardo, auditors of the Specialized Information Technology Audit Unit of TCU (Brazilian Federal Court of Accounts).

This article presents a methodology to assess the effective transparency of information published on public entities’ portals. It considers not only the mere availability of information but also quality requirements of information, such as timeliness, completeness, conciseness, accuracy, clarity, reliability, accessibility, and relevance.

The methodology involves evaluating the perception of information users, as the value of information depends on the user and the context in which it is considered. 

The mentioned methodology was applied in an audit conducted by the Brazilian Federal Court of Accounts (TCU) between 2020 and 2021. This audit evaluated the portals of the Ministry of Health, the Chamber of Deputies, and the TCU itself.

Quality Requirements of Information

Quality requirements of information are characteristics that can be used for analysis and measurement of information quality, considering subjective and objective aspects. This includes characteristics related to the information itself and those related to its use. 

Therefore, since information can be assessed against multiple criteria, it was essential to quantify its value by evaluating these quality requirements in order to verify the effectiveness of the information provided.

Consequently, after analyzing academic studies in information science and the COBIT 5: Enabling Information reference guide, the following information quality requirements were adopted: timeliness, accessibility, clarity, correctness, reliability, conciseness, relevance, open format, security, and completeness.

However, since the audit’s objective was to verify the real usefulness of the information from the users’ perspective, the security requirement was not assessed from that standpoint; instead, it was evaluated through a security self-assessment questionnaire completed by the respective organizations, which is beyond the scope of this article. Similarly, the “open format” requirement, being more technical in nature, was evaluated directly by the audit team using a specific evaluation model.

Methodology

The evaluation started by selecting the portals based on input from representative groups of society, such as business entities, government entities dealing with public data, social control organizations, consumer protection institutes, and researchers working with public data. After the consultation, 192 portal pages containing important information for these entities were listed, and the portals of the Ministry of Health, the Chamber of Deputies, and the TCU were the most cited.

The audit team then met with each portal’s managers to understand the information publishing process and the existence of procedures to ensure information quality.

Data Collection

To gather user perceptions regarding information quality, the audit team used two testimonial data collection techniques: focus groups and a survey with an evaluation questionnaire.

Focus Groups

The focus groups involved specialized users of the evaluated portals, such as journalists, lawyers, researchers, and representatives from social control entities. The goal was to discuss the quality of information published on the portals from their professional perspective.

Survey with Evaluation Questionnaire

An evaluation questionnaire was designed to quantify the value of information published on the portals concerning the chosen quality requirements. The respondents could provide their perceptions and feedback through both rating scales and free-text responses.

The questionnaire included the following questions related to various aspects of information quality:

  • The way to find and obtain information on the portal is simple and uncomplicated (Accessibility).
  • The information and data found on the portal are relevant and meet your needs (Relevance).
  • The information found on the portal is easy to understand and interpret (Clarity).
  • The information and data found on the portal are correct, i.e., free of errors such as grammatical and spelling mistakes, incorrect symbols, values, and units, etc. (Correctness).
  • The information found on the portal is complete, meaning there is all the necessary and sufficient data to make it useful (Completeness).
  • You trust the information and data found on the portal to make decisions or perform tasks (Reliability).
  • The information found on the portal is sufficiently up-to-date (Timeliness).
  • The information and data found on the portal are presented concisely, meaning they have an appropriate level of detail and do not contain unnecessary elements (Conciseness).

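To illustrate how rating-scale responses of this kind might be aggregated, the sketch below computes a mean score per quality requirement and ranks requirements from weakest to strongest. It is a minimal illustration only: the requirement names, scale (1–5), and response values are hypothetical, not data from the TCU audit.

```python
from statistics import mean

# Hypothetical Likert-scale responses (1 = strongly disagree, 5 = strongly agree),
# keyed by quality requirement. Values are illustrative, not audit data.
responses = {
    "accessibility": [4, 3, 5, 2, 4],
    "relevance":     [5, 4, 4, 5, 3],
    "timeliness":    [2, 3, 2, 1, 3],
}

def summarize(responses):
    """Return (requirement, mean score) pairs, lowest score first,
    so the requirements most in need of attention appear at the top."""
    scores = {req: round(mean(vals), 2) for req, vals in responses.items()}
    return sorted(scores.items(), key=lambda item: item[1])

for requirement, score in summarize(responses):
    print(f"{requirement}: {score}")
```

Ranking by ascending mean makes the weakest requirements (here, timeliness) surface first, which mirrors how the audit used user perceptions to flag where portals fell short.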
The survey respondents were mainly reached through collaboration from the portal managers, who published notes and news on their portals and official social media accounts, encouraging users to participate.

Results

The data collected from users indicated that important quality requirements established in legislation and best practices were not fully met by the portals of the TCU, the Chamber of Deputies, and the Ministry of Health, potentially compromising users’ effective utilization of the published information.

Some issues highlighted were difficulties in locating certain information, inadequate timeliness, lack of conciseness, and insufficient data for usefulness.

However, overall, users considered the portals’ information relevant, clear, correct, and reliable.

In addition, interviews with the portal managers revealed that there was no internal process for evaluating the quality of published information, considering the audited quality requirements and involving user feedback.

Conclusion

The evaluation of effective transparency of information on public entity portals must not neglect information quality requirements, such as timeliness, completeness, conciseness, accuracy, clarity, reliability, accessibility, and relevance. 

It must also consider the users’ perception of the information since the value of information depends on the user and context.

The evaluation should involve different user groups, including the general public, experts, academics, professionals related to the portal’s field, journalists, decision-makers, people with limited experience in consuming information, among others.

It should be noted that other criteria for assessing information quality, as well as other data collection and evaluation methodologies, could be adopted instead. This makes the work easily replicable by other public audit organizations.

Finally, besides fostering debate on the subject, ensuring that the information published on public entities’ portals meets information quality requirements can contribute to disseminating useful information to society, facilitating the effective exercise of social control and combating the spread of fake news.

Beyond its direct application in audits, the adoption of this methodology by supreme audit institutions (SAIs) can spark valuable discussions and positive changes in how public information is disseminated. As the debate surrounding information quality gains momentum, the methodology can act as a catalyst for broader improvements in transparency and accountability practices across various sectors.

The full report of this audit can be found at tcu.gov.br by searching for: “ACÓRDÃO Nº 878/2022 – TCU – Plenário; TC 037.554/2020-4”, or requested from klaussho@tcu.gov.br
