Current state of dental informatics in the field of health information systems: a scoping review

Abstract

Background

Over the past 50 years, dental informatics has developed significantly in the field of health information systems. Accordingly, several studies have been conducted on standardized clinical coding systems, data capture, and clinical data reuse in dentistry.

Methods

Based on the definition of health information systems, the literature search was divided into three specific sub-searches: “standardized clinical coding systems,” “data capture,” and “reuse of routine patient care data.” PubMed and Web of Science were searched for peer-reviewed articles. The review was conducted following the PRISMA-ScR protocol.

Results

A total of 44 articles were identified for inclusion in the review. Of these, 15 were related to “standardized clinical coding systems,” 15 to “data capture,” and 14 to “reuse of routine patient care data.” Articles related to standardized clinical coding systems focused on the design and/or development of proposed systems, on their evaluation and validation, on their adoption in academic settings, and on user perception. Articles related to data capture addressed the issue of data completeness, evaluated user interfaces and workflow integration, and proposed technical solutions. Finally, articles related to reuse of routine patient care data focused on clinical decision support systems centered on patient care, institutional or population-based health monitoring support systems, and clinical research.

Conclusions

While the development of health information systems, and especially standardized clinical coding systems, has led to significant progress in research and quality measures, most reviewed articles were published in the US. Clinical decision support systems that reuse EDR data have been little studied. Likewise, few studies have examined the working environment of dental practitioners or the pedagogical value of using health information systems in dentistry.

Background

Advances in dentistry largely depend on developments in information technology. Introduced by Zimmerman et al. in 1968 [1], “dental informatics” refers to the application of computer and information sciences to dentistry with the aim of improving clinical practice, research, education, and management [2, 3].

Some of the advances made in dental informatics include applications such as diagnostic devices, 2D or 3D digital acquisition, computer-assisted design and manufacturing, and computer-assisted surgery [4, 5]. These advances were examined in a systematic review published in 2017 [6].

Other studies have investigated the development of computerized health information systems (HISs) in dentistry. Health information systems are used to collect, store, process, and transmit the information needed to organize and implement care [4, 7]. One well-known HIS component is the electronic dental record (EDR), which is used by practitioners to document both patients’ medical and dental history and detailed information on consultations. Given the obvious benefits of EDRs, especially in the context of large clinical institutions, the overall EDR adoption rate in the US increased from 52% in 2012 [8] to 77% in 2017 [9]. Indeed, EDRs are not a simple transposition of paper records. Ideally interoperable with other HIS components, they make it possible to control data capture, facilitate data storage and access, support administrative and management processes, and guide public health policies. They can also be used in research and education [10–13].

To take full advantage of EDRs, particularly with regard to the communication, aggregation, and reuse of data, standardized clinical coding systems (SCCSs) are needed that are scalable, shareable, and adapted to the dentistry domain [13]. Such systems facilitate machine-readable documentation and allow for computerized comparisons of the outcomes of different treatments for the same diagnosis [13, 14].
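
As a purely illustrative sketch (not drawn from any of the reviewed systems), the following Python snippet shows how encounters tagged with a shared standardized diagnostic code can be grouped and the outcomes of different treatments compared automatically; all field names and codes are hypothetical.

```python
from collections import defaultdict

# Hypothetical EDR extract: each encounter carries a standardized diagnostic
# code, the treatment procedure performed, and a recorded outcome.
encounters = [
    {"dx": "DX-CARIES-DENTIN", "procedure": "composite_restoration", "success": True},
    {"dx": "DX-CARIES-DENTIN", "procedure": "composite_restoration", "success": False},
    {"dx": "DX-CARIES-DENTIN", "procedure": "crown", "success": True},
    {"dx": "DX-PULPITIS", "procedure": "root_canal", "success": True},
]

def outcomes_by_treatment(records, dx):
    """Success rate of each treatment documented for one standardized diagnosis."""
    tallies = defaultdict(lambda: [0, 0])  # procedure -> [successes, total]
    for r in records:
        if r["dx"] == dx:
            tallies[r["procedure"]][0] += int(r["success"])
            tallies[r["procedure"]][1] += 1
    return {proc: ok / total for proc, (ok, total) in tallies.items()}

print(outcomes_by_treatment(encounters, "DX-CARIES-DENTIN"))
# -> {'composite_restoration': 0.5, 'crown': 1.0}
```

Because every record shares the same code for the same condition, this kind of aggregation needs no free-text interpretation, which is precisely what an SCCS provides.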

In the early 2000s, no consensus-based standardized nomenclature for dental diagnoses and treatment outcomes was available [13]. The lack of a collective strategy among governments, health centers, and software developers strongly limited progress in this area [15]. Moreover, the most widely used coding system at the time, the International Classification of Diseases (ICD-9 [16]), provided limited coding for dentistry and was inadequate for making appropriate dental diagnoses [17]. Several studies have attempted to overcome this problem by proposing different coding systems, including EZCode [18], SNODENT, the Ontology for Dental Research [19], and the Oral Health and Disease Ontology [20]. The available literature on coding systems in dentistry is confusing at first glance: some of the proposed classifications have been redefined or renamed several times, and, in some cases, they have been merged with other coding systems to fit the needs of dentistry.

In addition to standardization issues, clinical dental coding is mainly regarded as an academic concern, even though it is equally relevant to private practitioners [2]. Difficulties in capturing standardized data may explain why private practitioners in dentistry have failed to adopt SCCSs. Making the capture of standardized data more efficient is therefore essential to improving usability, workflow integration, and data quality.

The reuse of EDR data holds much promise in the area of research. Not only can it diminish the costs and inefficiencies associated with clinical research, but shared EDR data warehouses can surpass many registries and data repositories in volume. Moreover, EDRs are real-world data sources that can be used in studies to produce real-world evidence, which in turn can help accelerate advances in care, improve outcomes for patients, and provide important insights for daily practice. Like other forms of retrospective research, EDR-based retrospective studies require neither patient recruitment nor the collection of new data, both of which are expensive and time-consuming. While the reuse of EDR data is a promising step towards decreasing research costs, facilitating patient-centered research, and speeding the rate of new medical discoveries [21], it is nevertheless limited by data quality concerns. Indeed, it is generally accepted that due to differences in priorities between clinical practice and research, clinical data are not recorded with the same care as research data [21]. Thus, in addition to proper ergonomics and workflow integration, there is a need for data capture forms that can intercept basic errors and provide real-time feedback to keep the user informed of what is going on, thus ensuring that the most accurate and complete information is collected.
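
As a minimal sketch of the kind of front-line validation described above, the following Python function checks a single charting entry for basic errors and returns immediate feedback messages; the required fields, value ranges, and field names are illustrative assumptions, not requirements taken from the reviewed literature.

```python
# Front-line validation for an EDR data capture form (illustrative only).
REQUIRED_FIELDS = {"patient_id", "tooth_number", "dx_code", "procedure_code"}

def validate_entry(entry: dict) -> list[str]:
    """Return real-time feedback messages for a single charting entry."""
    problems = []
    for field in REQUIRED_FIELDS - entry.keys():
        problems.append(f"Missing required field: {field}")
    tooth = entry.get("tooth_number")
    if tooth is not None and not (1 <= tooth <= 32):  # universal numbering, permanent dentition
        problems.append(f"Tooth number {tooth} is out of range (1-32)")
    if "dx_code" in entry and not entry["dx_code"].strip():
        problems.append("Diagnosis must not be left blank")
    return problems

print(validate_entry({"patient_id": "P-0042", "tooth_number": 48, "dx_code": " "}))
```

Surfacing such messages at the moment of entry, rather than during later data cleaning, is the "real-time feedback" the paragraph above refers to.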

In addition, EDR data can be used in clinical decision support systems (CDSSs) to provide real-time patient-centered clinical recommendations. They can also be used in educational settings to monitor students’ technical and theoretical knowledge as well as their clinical activity.

The primary objective of this scoping review was to summarize studies on SCCSs and EDR data capture in dentistry. The secondary objective was to explore the practical implications of reusing EDR data in CDSSs, quality measure development, and clinical research.

Methods

Based on the definition of HISs [22], the literature search was divided into three specific searches:

  1. HISs in dentistry and SCCSs

  2. HISs in dentistry and data capture

  3. HISs in dentistry and reuse of routine patient care data

For each of the searches, a review of the literature was conducted in August 2020 and then updated in January 2021 following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) protocol. PubMed was used to search the MEDLINE bibliographical database, and Web of Science was used to search all of its databases. Studies published between 1 January 2000 and 9 January 2021 were selected. This long inclusion period helped to account for developments in the field and for the current state of knowledge on each of the subjects discussed.

The search strategies used in this scoping review are presented in Table 1. Studies were selected according to the inclusion criteria established for each search (Table 1).

Table 1 Search strategies

Studies were excluded if they met any of the following criteria: (a) publication not in English; (b) study not specifically related to health information systems in clinical dentistry; (c) study using only EDR data without consideration of HISs; (d) publication in the form of a letter, editorial/opinion, abstract, conference abstract, case report, or book chapter.

For each search, duplicates were removed, and articles were initially selected based on their titles and abstracts. When the abstract did not provide sufficient information, the full text was read. All stages of the search were carried out by the authors and then carefully checked to minimize bias in the review process. In case of disagreement, the decision was made by consensus.

The flow diagrams of the three searches are summarized in Figs. 1, 2, and 3.

Fig. 1. PRISMA flow diagram of the search for studies on health information systems in dentistry and standardized clinical coding systems

Fig. 2. PRISMA flow diagram of the search for studies on health information systems in dentistry and data capture

Fig. 3. PRISMA flow diagram of the search for studies on health information systems in dentistry and reuse of routine patient care data

Only the first affiliation of the first author was considered in the geographical analysis of publications.

Results

A total of 44 articles were selected for review. In terms of geographical distribution, 31 articles (70%) had a first author affiliated with an institution in the United States; the other 13 articles (30%) had first authors affiliated with institutions in Australia, Canada, China, the Czech Republic, India, the Netherlands, Poland, Saudi Arabia, Sweden, or the United Kingdom.

Health information systems in dentistry and standardized clinical coding systems

Table 2 summarizes the articles selected in the first search.

Table 2 Articles selected in the search on “health information systems in dentistry and standardized clinical coding systems”

Design and development of standardized clinical coding systems

Articles [2, 14, 17–20, 23, 25, 32] trace the design and development of SCCSs in dentistry, as shown in Fig. 4.

Fig. 4. Design and development of standardized clinical coding systems [2, 14, 17–20, 23, 25, 32]. DDS: dental diagnostic system; OBO: open biomedical ontologies; ANSI: American National Standards Institute

All SCCSs examined in this scoping review are independent of national procedure coding systems, except for the one proposed by Lam et al. [2]. As a result of this evolution over time, only two SCCSs are available today:

  • SNOMED (and its subsets)—license-based access.

  • The Oral Health and Disease Ontology designed by Schleyer et al. [20] on the Open Biomedical Ontologies (OBO) Foundry framework [33]—freely available.

Evaluation and validation of standardized clinical coding systems

In 2005, Goldberg et al. [24] used a software-based method to evaluate the internal quality of SNODENT. This method consisted of searching for errors in the ontology by comparing different ways of extracting information from terms, concepts, descriptions, and definitions. The authors found that SNODENT had quality issues, mainly due to confusion between terms and concept codes (for example, unclear relationships between terms and concepts, polysemic concepts, and subsumption problems).

Practice-based evaluations were performed only for Z codes [18], the Dental Diagnostic System (DDS) [25, 28, 30], and SNODENT [31]. Some studies evaluated diagnostic code entry by determining the plausibility of the entered diagnostic code based on the entered treatment procedure code [18, 25, 28, 30], while others compared written diagnoses to diagnostic codification [31].
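
The plausibility check used in these evaluations can be pictured as a simple cross-walk between procedure codes and the diagnostic codes that can justify them. The Python sketch below uses invented codes and is not the actual validation logic of the cited studies.

```python
# Diagnosis-procedure plausibility audit (codes invented for illustration).
PLAUSIBLE_DX_FOR_PROCEDURE = {
    "PROC-RESTORATION": {"DX-CARIES", "DX-FRACTURED-TOOTH"},
    "PROC-EXTRACTION": {"DX-CARIES", "DX-PERIODONTITIS", "DX-IMPACTION"},
    "PROC-ROOT-CANAL": {"DX-PULPITIS", "DX-PERIAPICAL-ABSCESS"},
}

def audit(entries):
    """Yield entries whose documented diagnosis does not plausibly match the procedure."""
    for e in entries:
        allowed = PLAUSIBLE_DX_FOR_PROCEDURE.get(e["procedure"], set())
        if e["diagnosis"] not in allowed:
            yield e

chart = [
    {"patient": "P-01", "procedure": "PROC-ROOT-CANAL", "diagnosis": "DX-PULPITIS"},
    {"patient": "P-02", "procedure": "PROC-ROOT-CANAL", "diagnosis": "DX-CARIES"},
]
print(list(audit(chart)))  # flags the P-02 entry for review
```

Running such an audit over all completed charts gives the kind of validity rates reported in Table 3.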

The validation scores obtained in these studies are provided in Table 3.

Table 3 Evaluation and validation of standardized clinical coding systems

The selected studies highlighted different kinds of errors and explored various avenues for the improvement of SCCS usage.

Some studies found that the more comprehensive an SCCS is, the harder it is to navigate and the more complicated it is for practitioners to use [18, 25]. In addition, video analyses have shown that practitioners sometimes find it difficult to determine which code or concept represents the right choice, especially when different codes or concepts have similar meanings [31].

The modes by which diagnostic terms are found and entered were also shown to be a source of error. Accordingly, a human–machine interface that can support accurate and complete SCCS-based documentation should be developed. Moreover, data capture forms should be made more intuitive, quick, and easy to use. Lastly, control mechanisms with appropriate user feedback should be put in place to limit common errors [25, 31].

Finally, errors may be due to practitioners themselves, as they often have insufficient knowledge of SCCSs. Tokede et al. [25] highlighted practitioners’ lack of awareness of the impact of using standardized vocabulary. On the other hand, Yansane et al. [30] showed that practitioners become increasingly familiar with EDRs and SCCSs as they use them, and that this improves user experience and the quality of data entry. Coding errors may also reflect the miscalibration of diagnostic criteria. In their study based on clinical cases of carious lesions, Sutton et al. [28] found that participants were just as likely to choose an incorrect diagnostic code as the correct one when recording cases of enamel-limited lesions. In their view, this suggests both the need for faculty calibration (particularly in the field of diagnostics) and the educational value of using diagnostic codes.

In 2015, Reed et al. [26] evaluated the impact of SCCS exposure on dental students’ scores in the Health Sciences Reasoning Test (HSRT). The HSRT is designed to measure critical thinking skills and is specifically calibrated for health science practitioners and students in health science educational programs. The authors showed that students exposed to SCCSs (in this case the DDS) had significantly higher HSRT scores than students who had not been exposed.

Adoption of standardized clinical coding systems in academic settings

In 2017, Ramoni et al. [14] conducted a survey of US dental school deans about their usage of SCCSs. The response rate was 57% (35/61). A total of 32 deans reported using an EDR to document patient care and for administrative purposes such as billing; of these, 84% used the AxiUm EDR system. Twenty-nine deans were familiar with the DDS, but only ten had loaded it into their EDR for clinical use. Two schools used a self-selected subset of the SNODENT ontology, and five schools used the dental terminology of the ICD, 9th Revision [16].

To the authors’ knowledge, only one large private clinic had implemented the DDS in the US in 2017. By contrast, the DDS was recommended as a standard SCCS in the Netherlands in 2015 [14].

User perception of standardized clinical coding systems

Two articles assessed users’ perceptions of SCCSs (see Table 4) [27, 29]. Both concluded that users have a positive attitude towards standardized diagnostic terminologies (DxTMs). In the survey by Ramoni et al. [27], the highest average score on the Likert scale (a psychometric tool used to measure the degree of agreement with a proposal) was associated with the item “Standardized dental diagnostic terms would allow dental team members to use the same term to describe the same diagnosis,” followed by “standardized dental diagnostic terms would be useful.” However, 16% of the responses reflected confusion about what standardized dental diagnostic terminology entails, and participants expressed doubt that the use of SCCSs would result in better dental care [27].

Table 4 User perception of standardized clinical coding systems

In the study by Obadan-Udoh et al. [29], academic and non-academic users proposed a series of strategies to improve SCCS usage. These strategies can be divided into three types:

  • Political strategies, including usage obligation and financial incentives.

  • Educational strategies, including user training on the purpose and benefits of SCCSs.

  • Technical strategies, including smooth incorporation of SCCSs into EDRs with easier user interface and streamlined workflow.

Health information systems in dentistry and data capture

Table 5 presents the articles selected in the second search.

Table 5 Articles selected in the search “health information systems in dentistry and data capture”

Data completeness

Good record keeping is a fundamental professional and legal obligation. However, studies conducted in Australia, the United Kingdom, Scandinavian countries, and Egypt show that clinical dental record keeping practices do not meet basic standards [10, 41]. Thus, a 2000 study revealed that patient clinical information was absent in 9.4% to 87.1% of EDRs [45]. In 2016, Tokede et al. [10] used a Delphi process to determine what data should be entered in dental records and how often each clinical entry should be updated (Additional file 1: Appendix A). In so doing, they emphasized the need for consensus on what data are necessary. The assumption is that practitioners are less likely to record information deemed unimportant or worthless, causing the problem of incomplete or inaccurate data entry to persist [10]. However, not taking practitioners’ information needs into consideration can lead to organizational difficulties, prompting the use of both paper and electronic forms for data documentation. In this regard, it should be noted that Thierer et al. found that the rate of documentation of required data in progress notes increased from 61% to 81% after an educational intervention [45].

User interface and workflow integration

Improving user interfaces is an important concern [10, 11, 39, 40, 42, 43]. The following issues should be addressed: unintuitive interfaces, complex navigation within the interfaces, insufficient user feedback, and complex structured data capture.

Several user interface evaluation techniques were identified in the literature (Table 6).

Table 6 User interface evaluation techniques

In their 2014 article, Walji et al. [42] evaluated three different methods (user testing, semi-structured interviews, and surveys) for detecting usability problems in an EDR. They concluded that user testing detects usability problems better than the other two methods, but that a combination of complementary techniques is needed to provide a more comprehensive picture of EDR usability challenges.

According to [39, 42], issues related to EDR usability can be divided into three categories: user interface-related themes, SCCS-related themes, and work domain- and workflow-related themes.

More generally, the problems identified in the selected articles were: the lack of intuitiveness of the EDR interface, inadequate user guidance, and poorly organized controls. All of these were shown to impair users’ ability to determine how to perform a desired action and on which object the action should be performed [35, 40].

Lastly, the lack of easy and consistent access to patient data in most EDRs, and in particular the absence of an integrated view of the patient, requires users to switch between separate screens to see radiographs, intraoral photos, and clinical notes. This makes navigation cumbersome and introduces breakdowns in workflow [11, 38, 43, 44].

Other reported issues are summarized in Table 7.

Table 7 Issues related to the user interface and the workflow integration

Data capture solutions

Several studies proposed technical solutions to overcome difficulties in capturing data in EDRs.

In 2002, Chadwick et al. [34] suggested using barcodes to record undergraduate clinical activity in an academic environment. In 2009, Irwin et al. [36] developed and evaluated a natural language processing method aimed at extracting structured information from clinical notes in order to automatically annotate them with Unified Medical Language System codes [46].
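
For illustration only, the snippet below shows a toy dictionary-based annotator in the same spirit: free-text notes are scanned for known phrases and mapped to concept codes. The phrase-to-code table is invented and does not reproduce the UMLS or the method of Irwin et al.

```python
import re

# Invented phrase-to-concept table; real systems map to standardized vocabularies.
CONCEPT_DICTIONARY = {
    "deep caries": "C-0001",
    "pulpitis": "C-0002",
    "periodontal pocket": "C-0003",
}

def annotate(note: str):
    """Return (phrase, concept_code, start_offset) tuples found in a clinical note."""
    hits = []
    for phrase, code in CONCEPT_DICTIONARY.items():
        for match in re.finditer(re.escape(phrase), note.lower()):
            hits.append((phrase, code, match.start()))
    return sorted(hits, key=lambda h: h[2])

print(annotate("Tooth 19 presents deep caries; suspected pulpitis."))
```

Production NLP pipelines add negation handling, context, and disambiguation, but the output is the same in principle: structured codes derived from narrative text.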

Finally, a voice-supported EDR was developed and tested in the field of temporomandibular joint disorders [37].

Health information systems in dentistry and reuse of routine patient care data

Table 8 summarizes the articles selected in the last search.

Table 8 Articles selected in the search “health information systems in dentistry and reuse of routine patient care data”

Our scoping review found that routine patient care data are reused for three main purposes:

  • In CDSSs, with a focus on patient care [47,48,49,50,51,52].

  • In institutional or population-based health monitoring support systems, with a focus on the qualitative evaluation of care via quality measures [53,54,55,56].

  • In clinical research, with a focus on the discovery of new knowledge [57, 58].

Clinical decision support systems

Clinical decision support systems are computer programs designed to provide expert support for health professionals making clinical decisions [47]. These applications may be standalone systems, or they may interact with and reuse data from other tools, including EDRs [47,48,49, 59].
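
A highly simplified, rule-based sketch of how such a system might derive patient-centered alerts from EDR data is given below; the rules, thresholds, and data model are illustrative assumptions and do not describe any of the CDSSs reviewed here.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Patient:
    # Minimal stand-in for data pulled from an EDR (illustrative fields only).
    medications: set = field(default_factory=set)
    conditions: set = field(default_factory=set)
    last_hba1c: Optional[float] = None

def decision_support(patient: Patient, planned_procedure: str) -> list[str]:
    """Return alerts derived from the patient's record and the planned procedure."""
    alerts = []
    if planned_procedure == "extraction" and "bisphosphonate" in patient.medications:
        alerts.append("Bisphosphonate use documented: assess osteonecrosis risk before extraction.")
    if "diabetes" in patient.conditions and (patient.last_hba1c is None or patient.last_hba1c > 7.0):
        alerts.append("Diabetes with missing or elevated HbA1c: consider medical consultation.")
    return alerts

p = Patient(medications={"bisphosphonate"}, conditions={"diabetes"}, last_hba1c=8.2)
print(decision_support(p, "extraction"))
```

Integrating such rules directly into the EDR is what allows recommendations to appear at the point of care without duplicate data entry.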

In both our review and that of Sayed et al. [51], only three articles were found that proposed and evaluated a CDSS reusing EDR data [49, 50, 60] (Table 9).

Table 9 Clinical decision support systems reusing electronic dental record data

Studies of CDSSs concern all areas of dentistry [52]. They rely on different data analysis techniques: algorithmic systems, neural networks, probabilistic systems, logical/deductive systems, critiquing systems, and hybrid model systems [52].

The main limitations of CDSSs are:

  • The validity of CDSSs is mostly established internally, in narrow domains, and under varying conditions and technologies. Most CDSSs were not formally evaluated, and their value for clinical practice could not be established [47, 52].

  • CDSSs are proliferating as fragmented and isolated systems with a few clinic- or hospital-wide exceptions in academic centers [47, 48].

  • Structured data capture remains a challenge for all clinical information systems, including for CDSSs [47].

Quality measures

The National Quality Forum defines quality measures as “tools used to quantify the care provided to patients and gauge how improvement activities are indeed improving care or outcomes for certain conditions, in various settings, or during a specific timeframe” [53]. Quality measures concern all areas of health care delivery and population health, as defined by the National Quality Measures Clearinghouse: access, process, outcome, structure, use of service, health state, cost, and efficiency [53]. In 2011, the Health and Medicine Division [division of the National Academies of Sciences, Engineering, and Medicine] noted that the lack of quality measures acted as a barrier to improving oral health and reducing oral health disparities, and that quality measures in dentistry “lag far behind” those in medicine and other health professions [53,54,55].

Researchers, US state dental programs, and the US Dental Quality Alliance have sought to develop quality measures in the field of dentistry [53, 55, 56]. Except for some e-measures, all proposed measures were derived from administrative or claims-based data [55]. In the systematic review by Righolt et al. [54], only 2 out of 24 studies reused EDR data, specifically to assess the feasibility of an automated EDR-based quality measure (Table 10).

Table 10 Quality measures based on electronic dental record data

At present, the reuse of EDR data for quality measure development is far more common in the US than in Europe [53]. As EDR data are more detailed than claims data, they are considered more suitable for quality measurement [53]. Furthermore, the reuse of EDR data can advance quality measures through the automation of data collection, and can also increase transparency by providing access to information that would not otherwise be accessible [54]. However, the slow development of SCCSs and the increasing use of treatment procedure codes as a substitute for diagnosis severely limit both the reuse of EDR data for quality measure development and the ability to fully assess the impact of provided care [53].
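
As an illustration of what an automated EDR-based quality measure can look like, the sketch below computes the share of patients with diabetes who received a periodontal evaluation during a measurement year; the measure definition and field names are hypothetical, not a Dental Quality Alliance specification.

```python
from datetime import date

def quality_measure(patients, year=2020):
    """Proportion of patients with diabetes who had a periodontal evaluation in the given year."""
    denominator = [p for p in patients if "diabetes" in p["conditions"]]
    numerator = [
        p for p in denominator
        if any(v["procedure"] == "periodontal_evaluation" and v["date"].year == year
               for v in p["visits"])
    ]
    return len(numerator) / len(denominator) if denominator else None

patients = [
    {"conditions": {"diabetes"},
     "visits": [{"procedure": "periodontal_evaluation", "date": date(2020, 3, 2)}]},
    {"conditions": {"diabetes"}, "visits": []},
    {"conditions": set(), "visits": []},
]
print(quality_measure(patients))  # -> 0.5
```

The same denominator/numerator logic applied to claims data requires treatment codes only; applying it to EDR data allows diagnosis-based denominators, which is the advantage discussed above.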

It should be noted that Obadan-Udoh et al. [55] highlighted the ethical challenges posed by quality measures, in particular the risk of losing focus on the patient and that of compromising provider and patient autonomy.

Clinical research

The production of evidence-based knowledge using EDR data places the latter into a continuous cycle of improvement known as the Learning Health Care System [57]. Thus, after extraction, validation, and analysis, data from clinical practice can generate new knowledge, which in turn can influence clinical practice (Fig. 5).

Fig. 5. Learning Health Care System

In their systematic review, Song et al. [57] examined 60 studies that reused electronic patient data for dental clinical research. More than half of these studies addressed epidemiological topics, with a particular focus on the association between risk factors and various dental or medical conditions. All but two studies were retrospective, and most studies (72%) were conducted in the US.

The most frequently reported advantage of reusing EDR data is that it allows studies to be conducted more efficiently, at a lower cost, and with greater statistical power due to large sample sizes [57, 58]. Moreover, EDR data are deemed valuable because they constitute a rich resource for outcomes research. They help detect rare events or diseases and reduce study time [57, 58].

Over half of the studies examined in the systematic review by Song et al. [57] considered data availability and quality to be major limitations, as attested by the frequent presence of inaccurate, inconsistent, incomplete, or missing data. These limitations stem mainly from the fact that some data are not routinely documented in EDRs. However, they can also be attributed to coding errors or to inconsistent data documentation practices caused by the multiplicity of uncalibrated providers tasked with entering data [57].

Greater standardization of EDR data and increased adoption of public health databases and registries are needed to make EDR data more accessible in dental research [57]. In the US, a shared data warehouse named BigMouth was launched in August 2012, making data on 1.1 million patients available to users in the four contributing dental schools [58]. Today, ten institutions have contributed to BigMouth, providing data on more than 3 million patients [63].

Discussion

This scoping review traced developments in dental informatics over the last decades, with a particular focus on SCCSs, data capture, and the reuse of routine patient care data. To our knowledge, this is the first review to provide such a broad overview of the field.

Principal findings

Most selected studies were conducted in the US. Righolt et al. and Song et al. [54, 57] made the same observation in their reviews of the literature on EDR data reuse. This finding points to a great disparity in the development of dental informatics, even among so-called developed countries.

The use of standardized codes and terms for treatment procedures is ubiquitous in many countries (e.g., Current Dental Terminology in the US, the Uniform System of Coding and List of Services in Canada, the Classification Commune des Actes Médicaux (Common Classification of Medical Acts) in France, and UPT codes in the Netherlands) [25]. These codes are routinely used in dentistry to facilitate the recording of medical procedures in patients’ charts, the preparation of patient billing, and the transmission of data to third-party payers for patient reimbursement [18, 23]. Although these coding systems are comprehensive, they remain focused on treatment and cannot be used to describe patients’ conditions. In view of this, another type of SCCS has been proposed in dentistry: DxTMs. Unlike free-text notes, DxTMs allow HIS users to directly access information on patients’ conditions, track clinical outcomes, monitor best practices, and develop evidence-based guidelines [23, 27].

In the field of medicine, DxTMs have been in use for decades [18]. The best known DxTM is the ICD, which was adopted in 1900 as an international standard for describing diagnoses (it was then called the International List of Causes of Death) [32]. From the start, oral health diagnoses were barely represented in the ICD. Efforts were initially made to include new diagnoses, and in 1969 the first version of the Application of the ICD to Dentistry and Stomatology (ICD-DA) was issued [64]. Nevertheless, some authors have highlighted the inadequacy of the existing ICD-DA terminology for oral diagnosis documentation [17, 32].

In addition to ICD revisions, several proposals have been made to adapt SCCSs to dentistry. Of these, two ontologies are now available for use: SNOMED CT (and its subsets) (Fig. 6) and the Oral Health and Disease Ontology.

Fig. 6. SNOMED CT and its subsets [14, 32, 65]

Recently, some of the Oral Health and Disease Ontology developers have begun to participate in the revision and development of SNODENT. Therefore, a rapprochement between SNODENT and the Oral Health and Disease Ontology is likely, especially in terms of their use and implementation [66].

In the academic setting, the use of SCCSs can help reinforce students’ reasoning about why specific procedures need to be performed. Some EDRs used in dental schools allow students to enter three types of diagnoses: tentative, working, and definitive [18]. This process is of prime pedagogical interest, as it makes it possible to follow students’ clinical reasoning and its evolution over time. Likewise, determining the plausibility of the entered diagnostic code based on the entered treatment procedure code opens interesting pedagogical perspectives. More generally, SCCSs can be used to monitor teaching and knowledge acquisition at both the collective and individual levels. They can also highlight gaps in teaching or ambiguities in diagnostic classification, as shown by Sutton et al. [28], who highlighted the need for faculty calibration in the area of dental caries diagnosis. Reed et al. [26] found that the use of SCCSs has a positive impact on the education of dental students. This is consistent with the findings of a study on the pedagogical value of using SCCSs in other medical fields [67].

Although SCCSs have been developed specifically to describe patients’ conditions, their validation in practice is essential. Indeed, EDR data must be consistent with clinical reality to limit the need for post-capture data cleaning and to ensure correct inferences [25]. In order to fulfill their role effectively, EDRs require complete and accurate capture of clinical data [10], with a view to their possible reuse outside of care.

The semantic and syntactic proximity of certain terms tends to result in misuse, especially when users are not trained. These terms should therefore come with extra textual definitions. This is not the case yet for SNODENT, and the question of what SNOMED terms actually represent remains a matter of debate [66].

Moreover, the complexity of SCCSs makes them difficult to navigate from the EDR user interface, which can negatively impact data capture.

Many issues can thwart data capture: the lack of universally accepted documentation standards and information needs, incomplete or inaccurate record-keeping practices, poor usability of EDR user interfaces, the lack of easy and consistent access to patient data, and inadequate workflow integration [10, 12, 39, 44]. These various problems, both technical and socio-organizational, condition the usability of EDRs [11, 35, 39]. One of the main challenges of data capture is to ensure the customization of EDRs according to dental department needs and user expectations [11, 43].

Practitioners typically spend more time than they would like properly documenting clinical information, which reduces the time available for patient care or other activities [10]. The need to improve user interfaces has been identified as a major concern in the literature, as has the need to streamline workflow integration [10, 11, 39, 40, 42, 43]. For this purpose, EDRs should support the entire process of care by enabling the capture and display of all necessary information at the right time. Health care teams could also adapt their organization to simplify data capture and free up practitioners’ time. For example, dental assistants or patients could fill out forms electronically before appointments, such that practitioners would only need to review, update, and pull the information into the EDR [10]. However, excessive adaptation requirements, missing functionalities, and incompatibilities with clinical practice can generate workflow problems [39]. This requires taking into consideration practitioners’ environment, habits, and clinical specialization. Although it is difficult to customize EDRs to each practitioner, there is a need for EDRs that can adapt to the most common situations.

No study on the specifics of dental practitioners’ working environment was identified in our review. This is unfortunate, as the dental chair would benefit from being interfaced with the EDR. This would greatly facilitate the integration of data capture into processes at multiple levels (care, radiographic examination, etc.).

As regards care data reuse, EDRs have received increased attention because they help to expand evidence-based knowledge [57], assess needs, and improve quality of care. The most obvious obstacles to the widespread reuse of EDR data are the lack of standards in dentistry and data availability and quality issues [55, 57, 58]. The reuse of EDR data can complement traditional research methods or can function in synergy with them [57]. In the US, the work carried out on SCCSs has allowed for significant progress, in particular with the creation of the shared data warehouse BigMouth [58].

In addition, EDR data can be used to evaluate health changes and to monitor dental service utilization, care delivery, and disparities between treatment needs and provision [55, 57]. While the data most frequently used for this purpose are derived from dental insurance claims, EDRs allow for better quality measures because they contain more detailed data on patients [53]. The World Dental Federation has defined quality as an “iterative process involving dental professionals, patients, and other stakeholders to develop and maintain goals and measures to achieve optimal health outcomes” [56]. This definition highlights the need to develop quality measures in dentistry to improve patient care, but also to reduce costs and enhance patient experience [53,54,55,56]. Despite the interest in using EDRs for quality measure development, particularly in the areas of data collection automation and transparency, only two publications were found that examined quality measures based on EDRs [54]. According to Hunt et al. [53], faculty should implement quality measurement processes in their clinical programs to prepare graduates for their future practice.

Lastly, EDRs can be used with CDSSs to improve the quality of care. Indeed, CDSSs can provide real-time quality assurance, support treatment planning, generate alerts, or remind clinicians of the need to perform routine tasks for patients with potentially risky conditions [47, 48]. Despite the recognized need for CDSSs, their implementation has been limited by a lack of formal evaluation, challenges in developing standard representations, a lack of studies on decision-making processes, the cost and difficulties involved in the generation of a knowledge base, practitioner skepticism about the value and feasibility of decision support systems, etc. [48]. Ideally, CDSSs should be integrated into EDRs, as this allows physicians to capture data only once, without extra costs or workflow breakdown. Within the limits of this scoping review, only three studies proposed an EDR with an integrated CDSS [49, 50, 60].

Limitations

The main limitation of this review lies in the search strategy, which may have prevented us from identifying all studies of interest due to limitations in database coverage and to the particularities of article indexing. To compensate for this limitation, a manual bottom-up search of the references of each selected article was performed.

Regarding the reuse of routine patient care data, our work aimed only to include studies addressing data reuse from an information system perspective. Therefore, studies that reuse electronic data without consideration of HISs were excluded from this scoping review.

Another area of research related to data reuse is the Dental Practice-Based Research Network (DPBRN). The DPBRN brings together solo and large-group practitioners to accelerate the development and conduct of clinical studies on important issues in oral health care [68]. In this scoping review, no articles concerning the DPBRN could be included, owing to the bibliographic search strategy and the inclusion and exclusion criteria. Although in 2013 the overwhelming majority of DPBRN studies used paper forms for data collection [68], some studies have demonstrated the feasibility of conducting clinical research using electronic dental records within the DPBRN [69, 70].

Conclusion

Since its introduction in 1968, dental informatics has gradually developed in the field of HISs and has tried to catch up with advances in medicine. As was the case in the medical setting, the various issues raised by the standardization, capture, and reuse of data had to be addressed. In addition to technical difficulties, HISs present socio-organizational problems linked to workflow integration, human–machine interface, the use of EDRs in the academic setting, and scientific and ethical issues.

Given the new paradigm of clinical data reuse outside of care, it is necessary to maximize both the ease of use and the workflow integration of EDR data capture [10]. It also seems important to give practitioners access to data or indicators such as quality measures, both to allow them to manage and improve their clinical activity and to encourage them to capture data in an optimal way.

Efforts in terms of standardization and interoperability have led to concrete progress that allows EDR data to be aggregated with other health and non-health data (e.g., geographic data) to generate new, broader knowledge. Despite these advances, strong governance seems fundamental to achieving concrete results such as inter-university data warehouses [71].

Some studies selected in this review assessed the educational value of HISs, but only in relation to SCCSs. The latter can indeed be used to evaluate students’ diagnostic abilities in a clinical situation. Clinical data linked to care provided by students would also benefit from being exploited: they could be used to monitor the acquisition of student skills at both the technical and intellectual levels, or to ensure that students perform enough procedures correctly to be ready for independent practice. Moreover, HISs in dentistry can solve problems that are specific to academic research centers. For instance, they can reduce waiting times linked to teachers’ successive validations of the different stages of treatment performed by students [72].

In conclusion, there is a need for greater development of dental informatics in the field of HISs and for further studies on their educational value.

Availability of data and materials

Not applicable.

Abbreviations

HIS:

Health information system

EDR:

Electronic dental record

SCCS:

Standardized clinical coding system

ICD:

International Classification of Diseases

CDSS:

Clinical decision support system

PRISMA-ScR:

Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews

OBO:

Open biomedical ontologies

DDS:

Dental diagnostic system

HSRT:

Health sciences reasoning test

DPBRN:

Dental practice-based research network

DxTM:

Standardized diagnostic terminology

References

  1. Zimmerman JL, Ball MJ, Petroski SP. Computers in dentistry. Dent Clin North Am. 1986;30(4):739–43.

  2. Lam R, Kruger E, Tennant M. How a modified approach to dental coding can benefit personal and professional development with improved clinical outcomes. J Evid Based Dent Pract. 2014;14(4):174–82.

  3. Chhabra KG, Mulla SH, Deolia SG, Chhabra C, Singh J, Marwaha BS. Dental informatics in India: time to embrace the change. J Clin Diagn Res. 2016;10(3):ZE12–5.

  4. Islam MM, Poly TN, Li YJ. Recent advancement of clinical information systems: opportunities and challenges. Yearb Med Inform. 2018;27(1):83–90.

  5. Schleyer TK. Dental informatics: a work in progress. Adv Dent Res. 2003;17:9–15.

  6. Jokstad A. Computer-assisted technologies used in oral rehabilitation and the clinical documentation of alleged advantages - a systematic review. J Oral Rehabil. 2017;44(4):261–90.

  7. Masic F. Information systems in dentistry. Acta Inform Med. 2012;20(1):47–55.

  8. Acharya A, Schroeder D, Schwei K, Chyou PH. Update on electronic dental record and clinical computing adoption among dental practices in the United States. Clin Med Res. 2017;15(3–4):59–74.

  9. Chauhan Z, Samarah M, Unertl KM, Jones MW. Adoption of electronic dental records: examining the influence of practice characteristics on adoption in one state. Appl Clin Inform. 2018;9(3):635–45.

  10. Tokede O, Ramoni RB, Patton M, Da Silva JD, Kalenderian E. Clinical documentation of dental care in an era of electronic health record use. J Evid Based Dent Pract. 2016;16(3):154–60.

  11. Sidek YH, Martins JT. Perceived critical success factors of electronic health record system implementation in a dental clinic context: An organisational management perspective. Int J Med Inform. 2017;107:88–100.

  12. Wongsapai M, Suebnukarn S, Rajchagool S, Beach D, Kawaguchi S. Health-oriented electronic oral health record: development and evaluation. Health Inform J. 2014;20(2):104–17.

  13. Atkinson JC, Zeller GG, Shah C. Electronic patient records for dental school clinics: more than paperless systems. J Dent Educ. 2002;66(5):634–42.

  14. Ramoni RB, Etolue J, Tokede O, McClellan L, Simmons K, Yansane A, et al. Adoption of dental innovations: the case of a standardized dental diagnostic terminology. J Am Dent Assoc. 2017;148(5):319–27.

  15. Hovenga EJ, Grain H. Health information governance in a digital environment. Amsterdam: IOS Press; 2013.

  16. World Health Organization, editor. International classification of diseases, ninth revision. Geneva: World Health Organization; 1978. 331 p.

  17. Kalenderian E, Ramoni RL, White JM, Schoonheim-Klein ME, Stark PC, Kimmes NS, et al. The development of a dental diagnostic terminology. J Dent Educ. 2011;75(1):68–76.

  18. White JM, Kalenderian E, Stark PC, Ramoni RL, Vaderhobli R, Walji MF. Evaluating a dental diagnostic terminology in an electronic health record. J Dent Educ. 2011;75(5):605–15.

  19. Smith B, Goldberg LJ, Ruttenberg A, Glick M. Ontology and the future of dental research informatics. J Am Dent Assoc. 2010;141(10):1173–5.

  20. Schleyer TK, Ruttenberg A, Duncan W, Haendel M, Torniai C, Acharya A, et al. An ontology-based method for secondary use of electronic dental record data. AMIA Jt Summits Transl Sci Proc. 2013;2013:234–8.

  21. Weiskopf NG, Weng C. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research. J Am Med Inform Assoc. 2013;20(1):144–51.

  22. Faujdar DS, Sahay S, Singh T, Jindal H, Kumar R. Public health information systems for primary health care in India: a situational analysis study. J Family Med Prim Care. 2019;8(11):3640–6.

  23. Leake JL. Diagnostic codes in dentistry–definition, utility and developments to date. J Can Dent Assoc. 2002;68(7):403–6.

  24. Goldberg LJ, Ceusters W, Eisner J, Smith B. The significance of SNODENT. Stud Health Technol Inform. 2005;116:737–42.

  25. Tokede O, White J, Stark PC, Vaderhobli R, Walji MF, Ramoni R, et al. Assessing use of a standardized dental diagnostic terminology in an electronic health record. J Dent Educ. 2013;77(1):24–36.

  26. Reed SG, Adibi SS, Coover M, Gellin RG, Wahlquist AE, AbdulRahiman A, et al. Does use of an electronic health record with dental diagnostic system terminology promote dental students’ critical thinking? J Dent Educ. 2015;79(6):686–96.

  27. Ramoni RB, Walji MF, Kim S, Tokede O, McClellan L, Simmons K, et al. Attitudes and beliefs toward the use of a dental diagnostic terminology A survey of dental providers in a dental practice. J Am Dent Assoc. 2015;146(6):390–7.

  28. Sutton JC, Fay RM, Huynh CP, Johnson CD, Zhu L, Quock RL. Dental faculty accuracy when using diagnostic codes: a pilot study. J Dent Educ. 2017;81(5):554–60.

  29. Obadan-Udoh E, Simon L, Etolue J, Tokede O, White J, Spallek H, et al. Dental providers’ perspectives on diagnosis-driven dentistry: strategies to enhance adoption of dental diagnostic terminology. Int J Environ Res Public Health. 2017;14(7):767.

  30. Yansane A, Tokede O, White J, Etolue J, McClellan L, Walji M, et al. Utilization and validity of the dental diagnostic system over time in academic and private practice. JDR Clin Trans Res. 2019;4(2):143–50.

  31. Taylor HL, Siddiqui Z, Frazier K, Thyvalikakath T. Evaluation of a dental diagnostic terminology subset. Stud Health Technol Inform. 2019;264:1602.

  32. Kalenderian E, Ramoni RB, Walji MF. Standardized dental diagnostic terminology. Ann Dent Oral Health. 2018;1:1002.

  33. Schleyer TK, Ruttenberg A. Oral health and disease ontologies [Internet]. Github. 2018 [cited 2021 Feb 7]. https://github.com/oral-health-and-disease-ontologies/ohd-ontology.

  34. Chadwick BL, Oliver RG, Gyton J. Validation of undergraduate clinical data by electronic capture (barcode). Med Teach. 2002;24(2):193–6.

  35. Thyvalikakath TP, Monaco V, Thambuganipalle HB, Schleyer T. A usability evaluation of four commercial dental computer-based patient record systems. J Am Dent Assoc. 2008;139(12):1632–42.

  36. Irwin JY, Harkema H, Christensen LM, Schleyer T, Haug PJ, Chapman WW. Methodology to develop and evaluate a semantic representation for NLP. AMIA Annu Symp Proc. 2009;2009:271–5.

  37. Hippmann R, Dostálová T, Zvárová J, Nagy M, Seydlova M, Hanzlícek P, et al. Voice-supported electronic health record for temporomandibular joint disorders. Methods Inf Med. 2010;49(2):168–72.

  38. Hill HK, Stewart DC, Ash JS. Health Information Technology Systems profoundly impact users: a case study in a dental school. J Dent Educ. 2010;74(4):434–45.

  39. Walji MF, Kalenderian E, Tran D, Kookal KK, Nguyen V, Tokede O, et al. Detection and characterization of usability problems in structured data entry interfaces in dentistry. Int J Med Inform. 2013;82(2):128–38.

  40. Tancredi W, Torgersson O. An example of an application of the semiotic inspection method in the domain of computerized patient record system. Stud Health Technol Inform. 2013;192:471–5.

  41. Noureldin M, Mosallam R, Hassan SZ. Quality of documentation of electronic medical information systems at primary health care units in Alexandria, Egypt. East Mediterr Health J. 2014;20(2):105–11.

  42. Walji MF, Kalenderian E, Piotrowski M, Tran D, Kookal KK, Tokede O, et al. Are three methods better than one? A comparative assessment of usability evaluation methods in an EHR. Int J Med Inform. 2014;83(5):361–7.

  43. Thyvalikakath TP, Dziabiak MP, Johnson R, Torres-Urquidy MH, Acharya A, Yabes J, et al. Advancing cognitive engineering methods to support user interface design for electronic health records. Int J Med Inform. 2014;83(4):292–302.

  44. Schwei KM, Cooper R, Mahnke AN, Ye Z, Acharya A. Exploring dental providers’ workflow in an electronic dental record environment. Appl Clin Inform. 2016;7(2):516–33.

  45. Thierer TE, Delander KA. Improving documentation, compliance, and approvals in an electronic dental record at a U.S. dental school. J Dent Educ. 2017;81(4):442–9.

  46. Mishra R, Burke A, Gitman B, Verma P, Engelstad M, Haendel MA, et al. Data-driven method to enhance craniofacial and oral phenotype vocabularies. J Am Dent Assoc. 2019;150(11):933-939.e2.

  47. Mendonça EA. Clinical decision support systems: perspectives in dentistry. J Dent Educ. 2004;68(6):589–97.

  48. Khanna S. Artificial intelligence: contemporary applications and future compass. Int Dent J. 2010;60(4):269–72.

  49. Fricton J, Rindal DB, Rush W, Flottemesch T, Vazquez G, Thoele MJ, et al. The effect of electronic health records on the use of clinical care guidelines for patients with medically complex conditions. J Am Dent Assoc. 2011;142(10):1133–42.

  50. Chen Q, Wu J, Li S, Lyu P, Wang Y, Li M. An ontology-driven, case-based clinical decision support model for removable partial denture design. Sci Rep. 2016;6:1–8.

  51. Sayed ME. Effectiveness of clinical decision support systems for the survival of natural teeth: a community guide systematic review. Int J Prosthodont. 2019;32(4):333–8.

  52. Machoy ME, Szyszka-Sommerfeld L, Vegh A, Gedrange T, Woźniak K. The ways of using machine learning in dentistry. Adv Clin Exp Med. 2020;29(3):375–84.

  53. Hunt RJ, Ojha D. Oral health care quality measurement and its role in dental education. J Dent Educ. 2017;81(12):1395–404.

  54. Righolt AJ, Sidorenkov G, Faggion CM, Listl S, Duijster D. Quality measures for dental care: a systematic review. Community Dent Oral Epidemiol. 2019;47(1):12–23.

  55. Obadan-Udoh EM, Calvo JM, Panwar S, Simmons K, White JM, Walji MF, et al. Unintended consequences and challenges of quality measurements in dentistry. BMC Oral Health. 2019;19:1–5.

  56. Byrne MJ, Tickle M, Glenny A-M, Campbell S, Goodwin T, O’Malley L. A systematic review of quality measures used in primary care dentistry. Int Dent J. 2019;69(4):252–64.

  57. Song M, Liu K, Abromitis R, Schleyer TL. Reusing electronic patient data for dental clinical research: a review of current status. J Dent. 2013;41(12):1148–63.

  58. Walji MF, Kalenderian E, Stark PC, White JM, Kookal KK, Phan D, et al. BigMouth: a multi-institutional dental data repository. J Am Med Inform Assoc. 2014;21(6):1136–40.

  59. Umar H. Capabilities of computerized clinical decision support systems: the implications for the practicing dental professional. J Contemp Dent Pract. 2002;3(1):27–42.

  60. Rindal DB, Rush WA, Schleyer TK, Kirshner M, Boyle RG, Thoele MJ, et al. Computer-assisted guidance for dental office tobacco-cessation counseling: a randomized controlled trial. Am J Prev Med. 2013;44(3):260–4.

  61. Bhardwaj A, Ramoni R, Kalenderian E, Neumann A, Hebballi NB, White JM, et al. Measuring up: implementing a dental quality measure in the electronic health record context. J Am Dent Assoc. 2016;147(1):35–40.

  62. Neumann A, Kalenderian E, Ramoni R, Yansane A, Tokede B, Etolue J, et al. Evaluating quality of dental care among patients with diabetes. J Am Dent Assoc. 2017;148(9):634-643.e1.

  63. BigMouth Dental Data Repository. Frequently asked questions [Internet]. University of Texas. 2021 [cited 2021 May 11]. https://bigmouth.uth.edu/FAQDoc.php.

  64. World Health Organization. Application of the international classification of diseases to dentistry and stomatology. 3rd ed. Geneva: World Health Organization; 1994.

  65. NCBO BioPortal. SNOMED CT summary [Internet]. BioPortal. 2020 [cited 2020 Sep 4]. http://bioportal.bioontology.org/ontologies/SNOMEDCT.

  66. Duncan WD, Thyvalikakath T, Haendel M, Torniai C, Hernandez P, Song M, et al. Structuring, reuse and analysis of electronic dental data using the Oral Health and Disease Ontology. J Biomed Semant. 2020;11(1):8.

  67. Snyman S, Pressentin KBV, Clarke M. International Classification of Functioning, Disability and Health: catalyst for interprofessional education and collaborative practice. J Interprof Care. 2015;29(4):313–9.

  68. Schleyer T, Song M, Gilbert GH, Rindal DB, Fellows JL, Gordan VV, et al. Electronic dental record use and clinical information management patterns among practitioner-investigators in The Dental Practice-Based Research Network. J Am Dent Assoc. 2013;144(1):49–58.

  69. Thyvalikakath TP, Duncan WD, Siddiqui Z, LaPradd M, Eckert G, Schleyer T, et al. Leveraging electronic dental record data for clinical research in the national dental PBRN practices. Appl Clin Inform. 2020;11(2):305–14.

  70. Fellows JL, Rindal DB, Barasch A, Gullion CM, Rush W, Pihlstrom DJ, et al. ONJ in two dental practice-based research network regions. J Dent Res. 2011;90(4):433–8.

  71. Stark PC, Kalenderian E, White JM, Walji MF, Stewart DC, Kimmes N, et al. Consortium for Oral health-related informatics: improving dental research, education, and treatment. J Dent Educ. 2010;74(10):1051–65.

  72. Virun V, Singhe Dhaliwal G, Liu CJ, Sharma P, Kaur H, Nalliah RP. Improving efficiency in dental school clinics by computerizing a manual task. Dent J (Basel). 2019;7(2):44.

Acknowledgements

Not applicable.

Funding

None.

Author information

Authors and Affiliations

Authors

Contributions

All authors (BB, FB, and JD) contributed equally to the preparation and the review. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Ballester Benoit.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Minimum Clinical Documentation Checklist by Tokede et al.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Benoit, B., Frédéric, B. & Jean-Charles, D. Current state of dental informatics in the field of health information systems: a scoping review. BMC Oral Health 22, 131 (2022). https://doi.org/10.1186/s12903-022-02163-9
