Original Research

AI = Appropriate Insight? ChatGPT Appropriately Answers Parents’ Questions for Common Pediatric Orthopaedic Conditions

Natalie L. Zusman, MD; Matthew Bauer, MD; Jennah Mann, MD, FRCSC, MHCDS; Rachel Y. Goldstein, MD, MPH, FAAOS

Jackie and Gene Autry Orthopedic Center, Children’s Hospital Los Angeles, Los Angeles, CA

Correspondence: Natalie L. Zusman, MD, Jackie and Gene Autry Orthopedic Center, Children’s Hospital Los Angeles, Mail Code MS069, 4650 W. Sunset Blvd., Los Angeles, CA 90027. E-mail: [email protected]

Received: July 12, 2023; Accepted: September 6, 2023; Published: November 15, 2023

DOI: 10.55275/JPOSNA-2023-762

Volume 5, Number 4, November 2023


Background: Artificial intelligence services, such as ChatGPT (generative pre-trained transformer), can provide parents with tailored responses to their pediatric orthopaedic concerns. We undertook a qualitative study to assess the accuracy of the answers provided by ChatGPT, in comparison to OrthoKids (“OK”), a patient-facing educational platform governed by the Pediatric Orthopaedic Society of North America (POSNA), for common pediatric orthopaedic conditions.

Methods: A cross-sectional study was performed from May 26 to June 18, 2023. The OK website (orthokids.org) was reviewed, and 30 existing questions were collected. The corresponding OK and ChatGPT responses were recorded. Two pediatric orthopaedic surgeons assessed the answers provided by ChatGPT against the OK responses. Answers were graded as: AGREE (accurate information; question addressed in full), NEUTRAL (accurate information; question not answered), or DISAGREE (information was inaccurate or could be detrimental to patients’ health). The evaluators’ responses were compiled; discrepancies were adjudicated by a third pediatric orthopaedist. Additional chatbot answer characteristics, such as unprompted treatment recommendations, bias, and referral to a healthcare provider, were recorded. Data were analyzed using descriptive statistics.

Results: The chatbot’s answers were agreed upon in 93% of questions. Two responses were felt to be neutral. No responses were graded as disagree. Unprompted treatment recommendations were included in 55% of its responses (excluding treatment-specific questions). The chatbot encouraged users to “consult with a healthcare professional” in all responses. Recommendations were nearly evenly split between a generic provider (46%) and a specifically named pediatric orthopaedist (54%). The chatbot was inconsistent in its provider recommendations across related topics, such as recommending a pediatric orthopaedist in only 3 of 5 spine conditions.

Conclusion: Questions pertaining to common pediatric orthopaedic conditions were answered accurately by a chatbot in comparison to a specialty society-governed website. The knowledge that chatbots deliver appropriate responses is reassuring. However, the chatbot frequently offered unsolicited treatment recommendations while inconsistently recommending an orthopaedic consultation. We urge parents to exercise caution when utilizing artificial intelligence without also consulting a healthcare professional.

Level of Evidence: IV

Key Concepts

  • Artificial intelligence chatbots are becoming increasingly popular, as demonstrated by the rapid rise of publications on the topic in the last 3 months, and they represent a novel patient education online platform.
  • In comparing 30 common pediatric orthopaedic conditions, >90% of the chatbot’s responses were felt to be in agreement with a specialty society’s parent-patient-facing education platform.
  • The chatbot’s responses were largely unbiased and referred patients to a healthcare professional. However, the responses lacked references or citing sources for the provided information.


An important aspect of providing orthopaedic care to children is addressing parental questions and concerns. Parents have historically resorted to online resources, such as institutional websites, Google searches, and YouTube.1,2 OrthoKids (orthokids.org) is a website governed by the Pediatric Orthopaedic Society of North America (POSNA), focused on providing information to parents and patients regarding common pediatric orthopaedic conditions. This parent-patient-facing website is governed and maintained by a voluntary group of pediatric orthopaedic surgeons who are members of POSNA. These surgeons are responsible for the diverse topics available to the public as well as the integrity of the information. The framework of OrthoKids is that of human experts volunteering time to provide parents and patients with information, which differs dramatically from new sources of internet information. Recent technological advances now provide the opportunity for parents to ask specific questions of online artificial intelligence (AI) chatbots, such as ChatGPT (generative pre-trained transformer, chat.openai.com/), which was released in November 2022.3 Within the past several months, multiple publications have assessed the chatbot’s responses in comparison to physician, resident, and test-bank answers.4-8 As an example, a study published in the Journal of the American Medical Association (JAMA) in March 2023 investigated the appropriateness of a chatbot’s responses to 25 questions on cardiovascular disease prevention recommendations, as judged by preventive cardiologists. The cardiologists determined the AI-generated responses to be largely appropriate.8 Chatbots represent an easy and expeditious way for patients to acquire information and are becoming increasingly popular. It is therefore prudent to evaluate the potential role of AI chatbots in pediatric orthopaedics.
Given the rising popularity of chatbots, we undertook this study to assess the accuracy of the chatbot’s responses as compared to the physician-reviewed information available on the OrthoKids website. Secondary aims were to investigate how frequently the chatbot’s responses included (1) a suggestion that patients seek consultation with a healthcare provider and (2) unprompted treatment recommendations.

Materials and Methods

Institutional review board approval was not sought, as no patient data or protected healthcare identifiers were utilized, and the online resources utilized for the study are all open access. We undertook a cross-sectional study employing free online patient educational materials. The OrthoKids website was selected as the standard against which the chatbot could be compared. Specific conditions were reviewed from OrthoKids, and 30 topics were selected. The Frequently Asked Questions (FAQs) pages of the common conditions and the Dear Doctor forum were queried to create 30 questions reflective of various topics in pediatric orthopaedics. Question styles ranged from simple and direct (“what does spondylolysis mean?”) to brief vignettes (“my three-year-old daughter has developmental hip dysplasia and walks with a limp. What is the recommended treatment?”). See Table 1 for a complete list of questions. The responses from the OrthoKids website were documented. On May 30, 2023, ChatGPT was asked the 30 questions, and the chatbot’s responses were recorded. The topic, question, chatbot response, and OrthoKids response were reviewed by two pediatric orthopaedists. Answers were graded as: AGREE (information was accurate; question addressed in full), NEUTRAL (information was accurate; question not answered), or DISAGREE (information was inaccurate or response could be detrimental to patient’s health).

Table 1. Thirty Common Pediatric Orthopaedic Conditions and Questions Posed to ChatGPT

Number Topic Question OrthoKids Website URL
1 Adolescent Idiopathic Scoliosis How to treat adolescent idiopathic scoliosis of 46-degree curve? https://orthokids.org/conditions/scoliosis/
2 Arthrogryposis Will my child with arthrogryposis walk? https://orthokids.org/conditions/arthrogryposis-multiplex-congenita-(amc)/
3 Back Pain in Kids Should I worry if my adolescent son has back pain? https://orthokids.org/conditions/back-pain-in-children/
4 Bowlegs Will my 13-month-old son with bowlegs grow out of it? https://orthokids.org/conditions/bowed-legs-knock-knees/
5 Cerebral Palsy When should I take my child to the doctor to be evaluated for cerebral palsy? https://orthokids.org/conditions/cerebral-palsy/
6 Charcot Marie Tooth What is CMT? https://orthokids.org/conditions/charcot-marie-tooth/
7 Clubfoot Will my baby with clubfeet walk at a normal age? https://orthokids.org/conditions/clubfoot/
8 Developmental Hip Dysplasia My three-year-old daughter has developmental hip dysplasia and walks with a limp. What is the recommended treatment? https://orthokids.org/conditions/developmental-dysplasia-of-the-hip-(ddh)/
9 Early Onset Scoliosis My 6-year-old daughter was just diagnosed with scoliosis with a 25% curve and was told to wear a brace. Is a brace necessary? https://orthokids.org/conditions/early-onset-scoliosis-(eos)/
10 Flexible Flat Foot Does my child need special shoes for flexible flatfeet? https://orthokids.org/conditions/flexible-flat-feet/
11 In-Toeing When should I take my child to the doctor for in-toeing? https://orthokids.org/conditions/in-toeing/
12 Juvenile Arthritis What is the treatment for juvenile arthritis? https://orthokids.org/conditions/juvenile-arthritis/
13 Knock Knees My four-year-old son has knock knees. Should I worry? https://orthokids.org/dear-doctor/questions/
14 Leg Length Discrepancy What are the consequences of a leg length difference of 2.5 cm in a child? https://orthokids.org/conditions/leg-length-discrepancy/
15 Musculoskeletal Infection What is the treatment for a bone infection in a child? https://orthokids.org/conditions/musculoskeletal-infections/
16 Neonatal Brachial Plexus Birth Palsy How is neonatal brachial plexus birth palsy treated? https://www.orthokids.org/conditions/brachial-plexus-palsy/
17 Nonossifying Fibroma I was wondering if I can get more information about non-ossifying fibroma https://orthokids.org/conditions/non-ossifying-fibroma/
18 Osgood Schlatter When does knee pain due to Osgood-Schlatter’s Disease stop? https://orthokids.org/conditions/osgood-schlatter-s-disease/
19 Osteochondroma I have MHE osteochondroma. What is the possibility my kids will get this disease from me? https://orthokids.org/conditions/osteochondroma/
20 Osteogenesis Imperfecta What causes Brittle Bone Disease? https://orthokids.org/conditions/osteogenesis-imperfecta/
21 Perthes Should I worry if my first-grade son has a painless limp? https://orthokids.org/conditions/perthes-disease/
22 Rickets What causes rickets in children? https://orthokids.org/conditions/rickets/
23 Slipped Capital Femoral Epiphysis My son had a SCFE. What is the likelihood it will happen on the other hip? https://orthokids.org/conditions/slipped-capital-femoral-epiphysis-(scfe)/
24 Spina Bifida My son was born with spina bifida. What doctors does he need? https://orthokids.org/conditions/spina-bifida/
25 Spondylolysis What does spondylolysis mean? https://orthokids.org/conditions/spondylolysis-spondylolisthesis/
26 Tarsal Coalition What is a tarsal coalition? https://orthokids.org/conditions/tarsal-coalitions/
27 Toe Walking What is the treatment for walking on tippy toes? https://orthokids.org/conditions/toe-walking/
28 Torticollis Why does my baby tilt her head to only one side? https://orthokids.org/conditions/torticollis/
29 Unicameral Bone Cyst My grandson broke his arm, and it turns out he has a unicameral bone cyst. Is this dangerous? https://orthokids.org/conditions/unicameral-bone-cyst-(ubc)/
30 Vertical Talus Should I worry my baby was born with rocker bottom feet? https://orthokids.org/conditions/vertical-talus/

The evaluators’ responses were compiled; discrepancies were adjudicated by a third pediatric orthopaedist. Furthermore, the frequency of unprompted treatment recommendations (e.g., a response including treatment options when the question stem is, “what does spondylolysis mean?”) was recorded. Chatbot responses referring users to a healthcare professional, as well as the type of healthcare professional recommended (nonspecific versus specialist), were recorded. Terms suggestive of bias, such as the name of a healthcare institution or industry partner, were also collected. Descriptive statistics were performed using Microsoft Excel, version 16.
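The compile-then-adjudicate workflow described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the study's actual analysis script: the function names and the sample data are invented, and the counts are chosen only so the output matches the 93%/7%/0% split reported in the Results.

```python
# Hypothetical sketch of the grading workflow: two raters grade each chatbot
# answer, discrepancies are adjudicated by a third reviewer, and descriptive
# statistics summarize the final grades. Names and data are illustrative.
from collections import Counter

GRADES = ("AGREE", "NEUTRAL", "DISAGREE")

def final_grade(rater1, rater2, adjudicator):
    """Return the compiled grade: the raters' consensus, else the adjudicator's call."""
    return rater1 if rater1 == rater2 else adjudicator

def summarize(graded_questions):
    """graded_questions: iterable of (rater1, rater2, adjudicator) grade tuples."""
    finals = [final_grade(r1, r2, adj) for r1, r2, adj in graded_questions]
    counts = Counter(finals)
    total = len(finals)
    return {g: round(100 * counts.get(g, 0) / total) for g in GRADES}

# Illustrative data: 28 consensus-AGREE questions plus 2 adjudicated NEUTRALs
# reproduce the 93% / 7% / 0% split reported in the Results.
data = [("AGREE", "AGREE", "AGREE")] * 28 + [("AGREE", "NEUTRAL", "NEUTRAL")] * 2
print(summarize(data))  # {'AGREE': 93, 'NEUTRAL': 7, 'DISAGREE': 0}
```

The adjudicator's grade is consulted only when the two primary raters disagree, mirroring the tie-breaking role described in the text.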


Results

Of the 30 questions, the chatbot’s answers were graded as agree (i.e., appropriate and the question answered in full) in 93% (n=28, Table 1). Two questions were felt to be neutral, as the chatbot response failed to address the question in full. As an example, the chatbot did not specify that a three-year-old with developmental hip dysplasia would likely require an open hip reduction (graded as neutral), whereas it did confirm that a six-year-old with a spine curvature of 25 degrees would require bracing (graded as agree). No responses were graded as disagree.

After excluding questions requesting treatment recommendations (n=8), the chatbot included unprompted treatment recommendations in 55% of its responses (n=12). As an abbreviated example, the response to “what is a tarsal coalition” referred to activity modification, orthotics, physical therapy, medications, and surgery (see Table 2 for a similar example using slipped capital femoral epiphysis). Another example included a discussion of therapy, orthotics, and orthopaedic interventions when asked, “what is CMT [Charcot Marie Tooth]?” Topics that did not elicit unprompted treatment recommendations included adolescent back pain, cerebral palsy, and bowlegs in a toddler.
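The denominator arithmetic behind the 55% figure can be made explicit. The counts below come from this paragraph; only the calculation itself is shown.

```python
# Minimal arithmetic check of the unprompted-recommendation rate quoted above.
total_questions = 30
treatment_questions = 8   # excluded: these questions asked for treatment advice
unprompted = 12           # responses that volunteered treatment options anyway

eligible = total_questions - treatment_questions  # 22 eligible responses
rate = round(100 * unprompted / eligible)         # rounds 54.5... up to 55
print(f"{unprompted}/{eligible} = {rate}%")       # 12/22 = 55%
```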

Table 2. Sample Response from ChatGPT and OrthoKids on the Topic of Slipped Capital Femoral Epiphysis

“My son had a Slipped Capital Femoral Epiphysis. What is the likelihood it will happen on the other hip?”

ChatGPT response:
  • A slipped capital femoral epiphysis (SCFE) is a condition where the ball-shaped head of the thigh bone slips off the neck of the bone at the hip joint. It typically occurs during periods of rapid growth, most commonly in children and adolescents. When one hip is affected by SCFE, there is a risk that the condition may also develop in the other hip. The likelihood of SCFE occurring on the other hip can vary depending on several factors.
  • Here are some key points to consider:
  • Bilateral SCFE: Bilateral SCFE refers to the occurrence of SCFE in both hips. While it is less common than unilateral SCFE (affecting one hip), it is estimated that around 20-50% of individuals with SCFE on one side may develop it in the other hip as well.
  • Timing: The risk of developing SCFE in the other hip is highest within the first year after the initial diagnosis. Therefore, it is crucial to closely monitor and follow up with your healthcare professional during this period.
  • Factors influencing bilateral occurrence: Several factors can increase the likelihood of bilateral SCFE, including younger age at the time of the initial SCFE, increased severity of the initial slip, and certain underlying conditions or anatomical abnormalities that contribute to hip instability or weakness.
  • Regular follow-up: After experiencing SCFE in one hip, it is essential to continue regular follow-up appointments with a pediatric orthopaedic specialist. They will monitor the affected hip, evaluate the other hip for any signs of slip, and provide appropriate recommendations and interventions to minimize the risk of SCFE in the other hip.
  • Preventive measures: In some cases, if the initial slip is detected early and managed promptly, it may help reduce the risk of SCFE developing in the other hip. Treatment options for SCFE include surgical and non-surgical interventions, such as the use of pins or screws to stabilize the slipped femoral head.
  • It’s important to remember that each case is unique, and the likelihood of SCFE occurring in the other hip can vary. Regular monitoring, early detection, and appropriate management are key in minimizing the risk and ensuring the best possible outcomes for your son’s hip health.
  • Consulting with a pediatric orthopaedic specialist who is familiar with your son’s case will provide you with the most accurate information and individualized guidance based on his specific situation.
OrthoKids response: If someone has a SCFE on one side, there is a 1 in 4 chance that the other hip could develop this problem as well.

The chatbot encouraged users to “consult with a healthcare professional” in all 30 responses (100%). Recommendations were nearly evenly split between a generic provider (46%) and a specifically named pediatric orthopaedist (54%). See Table 2 for an example in which a pediatric orthopaedist is specifically recommended. The chatbot was also inconsistent in its provider recommendations; a pediatric orthopaedist was not explicitly recommended in the topics of adolescent idiopathic scoliosis or adolescent back pain but was included when discussing early-onset scoliosis, spina bifida, and spondylolysis. Inconsistent recommendations were also observed among the pediatric foot conditions: orthopaedic consultation was referenced in the topics of toe walking, tarsal coalition, Charcot-Marie-Tooth, and congenital vertical talus but not in arthrogryposis, clubfoot, or flexible flat foot.

Bias was universally low amongst answers. There were no references to specific providers, corporations, or healthcare institutions. The only answer with potential for bias was the chatbot’s inclusion of a Denis Browne bar in a discussion of bracing after clubfoot casting. The Denis Browne bar is a specifically named orthosis, whereas bias could have been avoided by referring to a “foot abduction orthosis.” No devices were named in discussions of bracing for adolescent idiopathic scoliosis, early-onset scoliosis, or abduction orthoses for developmental hip dysplasia (DDH). Specific brace names are mentioned on the OrthoKids webpage on adolescent idiopathic scoliosis as part of an inclusive video on bracing. Trademarked devices are included on the Early Onset Scoliosis OrthoKids page to refer to patented surgical devices and procedures. OrthoKids does not include named orthoses when reviewing DDH or clubfoot.


Discussion

The results of this investigation join a growing number of publications reviewing the rapidly expanding role of AI in modern healthcare. All three study aims were accomplished: determining the proportion of appropriate chatbot responses, observing referrals to a healthcare provider, and noting the frequency of unprompted treatment recommendations. These findings in turn prompt questions regarding other internet sources of information (e.g., online videos), reasons why parents may utilize non-specialty-specific websites, and concerns regarding the quality of the information presented.

Our findings concurred with previously published reports regarding the appropriate nature of chatbot responses to healthcare questions, including Ayoub et al.7 and Sarraju et al.8 ChatGPT responses are frequently appropriate, with reported rates ranging from 73-84% in the literature and slightly higher in our findings (93%).7,8 Increased confidence in AI-generated information has important implications for providers, given that parents and patients frequently utilize online resources to expand their health literacy. As commented in prior publications on the role of AI in healthcare, chatbots could have the potential to improve parent and patient satisfaction by reinforcing new diagnoses, treatment recommendations, or even postoperative instructions.6-8

One aim of this investigation was to assess the appropriateness of the information presented in the chatbot’s responses, the importance of which is further emphasized by the high frequency of unprompted treatment recommendations. Ensuring the quality and lack of bias of publicly available healthcare information is critical. Our findings demonstrate the quality to be high and the bias to be low; however, the chatbot did fail to include components that an educated reader may seek. In 1997, the JAMA benchmark criteria were published with the specific intent of providing standards for publicly available online medical information.9 The four criteria are authorship, attribution (e.g., works cited), disclosure, and currency (e.g., publication date and updates).9 With regard to ChatGPT, none of the four criteria were met. No references were provided with answers, and no authors were credited with the information presented. Failure to appropriately attribute data sources is concerning to healthcare providers, in part because the data within the chatbot answers cannot be checked or corrected. It is unclear whether this represents accidental plagiarism or is secondary to the AI system’s beta-testing state. Of note, the OrthoKids webpages do not include references or authorship but do include links to the same topic page on the AAOS website (orthoinfo.aaos.org/), which does include disclosure and date of update. Regardless, identifying a means of including references would improve the quality and reliability of the chatbot’s answers.

Parents frequently turn to the internet for healthcare information; therefore, it is plausible they will also utilize AI as another online platform.10-12 In a 2019 study on parental internet use in pediatric orthopaedics, 97% of survey respondents used a search engine for healthcare information and 28% utilized social media.10 Shortly thereafter, reports on the quality of YouTube videos on pediatric orthopaedic conditions (e.g., developmental hip dysplasia and clubfoot) began appearing in the literature.1,2 The YouTube studies reported high variability in the quality of information presented in the videos, and the higher-quality videos were those posted by physicians.1,2 ChatGPT’s 100 million monthly users lag behind YouTube’s 2.5 billion monthly users, yet the AI platform’s rapid user growth highlights the importance of healthcare professionals critically assessing the information being delivered to consumers.13,14

The current availability of online health information is expansive, including search engines, curated health platforms (e.g., WebMD, www.webmd.com/), institution webpages, medical videos, and specialty-governed websites (such as POSNA or the American Academy of Pediatrics [AAP]). Prior research has suggested parents infrequently utilize specialty-specific websites.10 The rapidly growing interest and concurrent rise in publications on chatbots’ potential role in healthcare introduce a new query: why might parents and patients select AI over a specialty-governed website? Feghhi et al.15 in 2014 and Ó Doinn et al.16 in 2021 investigated the specialty websites from POSNA, AAP, and the American Academy of Orthopaedic Surgeons (AAOS). The authors concluded that the information presented on these websites tended to be written above the recommended reading level for a layperson, had barriers to accessibility (such as needing to register to access a page), and tended toward long words and sentence structures.15,16 One concern raised was that the quality of information failed to translate to comprehension and thus failed to affect health literacy.16 Our study did not assess the reading level of the chatbot responses, nor did we formally evaluate accessibility and usability. The chatbot does limit accessibility in a manner similar to some of the specialty websites because it requires the user to register prior to asking a question. The chatbot’s usability is high, as the answer is presented in a simple essay-style format (Table 2). Unlike the OrthoKids website, the chatbot answers did not incorporate images, diagrams, or videos, which might negatively impact a visual or auditory learner. Another apparent difference is response length. As demonstrated by the example question in Table 2, the ChatGPT answer was more robust than the OrthoKids response. Both responses accurately answered the question; however, the OrthoKids response was 25 words in contrast to the chatbot’s 388-word answer. The clinical implication of this difference in response length for patient comprehension is unknown. Future investigations could formally explore the readability of the chatbot’s answers as well as parental preference for AI-generated responses versus information presented on specialty websites.
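A length and readability comparison of this kind could be prototyped in a few lines of Python. This is a hedged sketch, not the study's method: the OrthoKids string is the Table 2 response, the chatbot string is an invented stand-in excerpt, and the sentence splitter is a deliberately crude proxy for readability tooling.

```python
# Sketch of a response-length comparison with a rough readability proxy
# (average words per sentence). Strings and splitter are illustrative only.
import re

def word_count(text):
    return len(text.split())

def avg_words_per_sentence(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return word_count(text) / max(len(sentences), 1)

# The actual 25-word OrthoKids answer from Table 2.
orthokids = ("If someone has a SCFE on one side, there is a 1 in 4 chance "
             "that the other hip could develop this problem as well.")
# Invented stand-in for a longer chatbot-style answer.
chatbot = ("A slipped capital femoral epiphysis is a condition where the head "
           "of the thigh bone slips off at the hip joint. When one hip is "
           "affected, there is a risk the other hip may develop it as well.")

print(word_count(orthokids))                        # 25, matching the count above
print(word_count(chatbot) > word_count(orthokids))  # True
```

A formal study would substitute validated readability measures (e.g., Flesch-Kincaid) for the crude words-per-sentence proxy used here.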

To our knowledge, this investigation is the first to evaluate the appropriateness of ChatGPT answers in comparison to the OrthoKids specialty page. One strength of the study is the inclusion of 30 common pediatric orthopaedic conditions, the topics from which we anticipate the majority of parental questions arise. A prior publication surveyed parents regarding the common pediatric orthopaedic conditions they researched on the internet, and we ensured these topics were included in our query.10 There are limitations to acknowledge. Creating our own grading scale posed a risk of selection bias, which we attempted to mitigate by critiquing the scales published in similar articles and devising a simple three-component scale.6,8 Reviewers were not blinded when grading the responses, given that the primary outcome was to assess the appropriateness of the chatbot’s responses against the POSNA information, which was treated as a proxy “gold standard answer,” acknowledging there is no true standard among subjective internet patient-facing educational platforms. The authors further attempted to limit the potential for sampling bias in the questions by referring to real-world examples. The FAQs pages on OrthoKids were the most common source of questions, followed by the OrthoKids Dear Doctor forum. Utilizing pre-existing questions was felt to be protective against introducing bias in comparison to generating our own questions.

The findings of this cross-sectional study demonstrate that ChatGPT responses to common pediatric orthopaedic conditions are frequently appropriate, although the answers often contained unprompted treatment recommendations and inconsistently recommended consultation with a pediatric orthopaedist. Limitations of the chatbot included failure to meet the JAMA benchmark criteria for quality online healthcare information and a lack of visual or auditory accompaniments to the answers. We ultimately do perceive value in AI chatbots for providing parents and patients with easily accessible online health information; however, we urge great caution until quality components (such as attributed data sources) are routinely incorporated into responses.



No funding was received. The authors report no conflicts of interest related to this manuscript.


References

  1. Ranade AS, Belthur MV, Oka GA, et al. YouTube as an information source for clubfoot: a quality analysis of video content. J Pediatr Orthop B. 2020;29(4):375-378.
  2. Lock AM, Baker JF. Quality of YouTube videos for three common pediatric hip conditions: developmental hip dysplasia, slipped capital femoral epiphysis and Legg-Calve-Perthes disease. J Pediatr Orthop B. 2022;31(6):546-553.
  3. Introducing ChatGPT. Available at: https://openai.com/blog/chatgpt. Accessed June 15, 2023.
  4. Mihalache A, Popovic MM, Muni RH. Performance of an artificial intelligence chatbot in ophthalmic knowledge assessment. JAMA Ophthalmol. 2023;141(6):589-597.
  5. Reznik AM. A test drive of ChatGPT and generative AI in orthopaedics. AAOS Now. Available at: https://www.aaos.org/aaosnow/2023/may/managing/managing02/. Published May 10, 2023. Accessed June 3, 2023.
  6. Ayers JW, Poliak A, Dredze M, et al. Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Intern Med. 2023;183(6):589-596.
  7. Ayoub NF, Lee YJ, Grimm D, et al. Comparison between ChatGPT and Google Search as sources of postoperative patient instructions. JAMA Otolaryngol Head Neck Surg. 2023;149(6):556-558.
  8. Sarraju A, Bruemmer D, Van Iterson E, et al. Appropriateness of cardiovascular disease prevention recommendations obtained from a popular online chat-based artificial intelligence model. JAMA. 2023;329(10):842-844.
  9. Silberg WM. Assessing, controlling, and assuring the quality of medical information on the internet: Caveant lector et viewor—let the reader and viewer beware. JAMA. 1997;277(15):1244-1245.
  10. Peterlein CD, Bosch M, Timmesfeld N, et al. Parental internet search in the field of pediatric orthopedics. Eur J Pediatr. 2019;178(6):929-935.
  11. Yudianto B, Caldwell PH, Nanan R, et al. Patterns of parental online health information-seeking behaviour. J Paediatr Child Health. 2023;59(5):743-752.
  12. Baker JF, Devitt BM, Lynch S, et al. Internet use by parents of children attending a dedicated scoliosis outpatient clinic. Eur Spine J. 2012;21(10):1972-1977.
  13. Hu K. ChatGPT sets record for fastest-growing user base – analyst note. Reuters. Available at: https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/. Published February 2, 2023. Accessed June 3, 2023.
  14. Aslam S. YouTube by the numbers: stats, demographics & fun facts. Omnicore.
  15. Feghhi DP, Komlos D, Agarwal N, et al. Quality of online pediatric orthopaedic education materials. J Bone Joint Surg Am. 2014;96(23):e194.
  16. Ó Doinn T, Broderick JM, Abdelhalim MM, et al. Readability of patient educational materials in pediatric orthopaedics. J Bone Joint Surg Am. 2021;103(12):e47.