
The Lack of High-quality Educational Resources about Adhesive Capsulitis on YouTube

Abstract

Objective

The advent of the Internet has provided new, easily accessible resources for patients seeking additional health information. Many doctors and healthcare organizations post informative videos on YouTube, and many patients look for videos online for a second opinion. The aim of this study was to evaluate the reliability, educational value, and overall quality of YouTube videos about adhesive capsulitis.

Methods

The phrases "frozen shoulder," "frozen shoulder treatment," "adhesive" capsulitis, and "adhesive capsulitis treatment" were entered into YouTube's search bar for a normal inquiry. The informativeness and overall quality of the adhesive capsulitis videos were rated using three separate scales.

Results

The mean ± standard deviation values of the scoring systems were 1.25 ± 0.51 for JAMA, 39.4 ± 13.4 for DISCERN, 2.83 ± 0.96 for the Global Quality Score (GQS), and 7.43 ± 4.86 for the Adhesive Capsulitis Specific Score (ACSS). The number of views and the view rate correlated positively with the GQS, and the number of likes correlated positively with the GQS, DISCERN, and ACSS. There was no statistically significant difference in the median JAMA, GQS, and DISCERN values according to the video source/uploader (p > 0.05).

Conclusion

YouTube videos on adhesive capsulitis are of insufficient quality, reliability, and educational value. There is a need for reliable, instructive, high-quality videos about adhesive capsulitis.

Keywords
adhesive capsulitis; bursitis; video recording; social media; internet

Resumo

Objetivo

O advento da Internet proporcionou recursos novos e de fácil acesso para pacientes que procuram mais informações sobre saúde. Muitos médicos e organizações de saúde publicam vídeos informativos nesta plataforma e quase todos os pacientes procuram tais vídeos online para uma segunda opinião.

Métodos

As frases “frozen shoulder (ombro congelado)”, “frozen shoulder treatment (tratamento de ombro congelado)”, “adhesive capsulitis (capsulite adesiva)” e “adhesive capsulitis treatment (tratamento de capsulite adesiva)” foram inseridas na barra de pesquisa do YouTube para uma consulta normal. A informatividade e a qualidade geral dos vídeos sobre capsulite adesiva foram avaliadas usando três escalas distintas.

Resultados

Os valores de média e desvio padrão dos sistemas de pontuação do Journal of the American Medical Association (JAMA) foram 1,25 ± 0,51, DISCERN, 39,4 ± 13,4, Global Quality Score (GQS, Índice de Qualidade Global em português) 2,83 ± 0,96 e Adhesive Capsulitis Specific Score (ACSS, Escore Específico de Capsulite Adesiva em português), 7,43 ± 4,86, respectivamente. O número de visualizações, a taxa de visualizações e as curtidas tiveram uma correlação positiva com GQS, DISCERN e ACSS. Não houve diferença estatisticamente significativa entre os valores medianos de JAMA, GQS e DISCERN de acordo com a fonte/carregador do vídeo (p > 0,05).

Conclusão

Os vídeos do YouTube sobre capsulite adesiva precisam ter maior qualidade, confiabilidade e qualidade instrutiva. Há necessidade de vídeos confiáveis sobre capsulite adesiva, com citações instrutivas e de alta qualidade.

Palavras-chave
capsulite adesiva; bursite; gravação em vídeo; redes sociais; internet

Introduction

The advent of the Internet has provided new, easily accessible resources for patients seeking additional health information.1 When it comes to broad Internet searches, YouTube is second only to Google in popularity, and patients are increasingly turning to it as a means of learning about available healthcare options.2 Many doctors and healthcare organizations post informative videos on this platform, and many patients look for videos online for a second opinion. YouTube is not a peer-reviewed platform, so this development raises questions about the reliability of the information presented in its medical-related videos.3 Adhesive capsulitis, also known as frozen shoulder, is a common shoulder problem that manifests as progressive loss of glenohumeral motion with pain.4 It is one of the most common musculoskeletal problems seen in the orthopaedic population. Despite its prevalence and the advancements in shoulder surgery, many questions remain about the optimal course of therapy.5 Given these unknowns, patients with adhesive capsulitis are likely to use YouTube to explore treatment options.

Several studies have shown that the educational quality of YouTube videos dealing with orthopaedic diseases is inadequate.1-3,6-8 Only one study in the literature has examined YouTube videos related to adhesive capsulitis.9 Its results were consistent with those of other research; however, only videos relevant to a single search keyword were included in that study.9 Our goal in the present study was to examine the informativeness and overall quality of these videos by expanding the search phrases that adhesive capsulitis patients might use to find them on YouTube. In line with other studies in the literature, we hypothesized that the quality and educational value of these videos would be insufficient.

Materials and Methods

On February 18, 2022, using Google Chrome (version 92.0.4515.159, 64-bit) with the cache and cookies cleared, a search was performed on YouTube's database. The search terms were "frozen shoulder," "frozen shoulder treatment," "adhesive capsulitis," and "adhesive capsulitis treatment." The top 50 videos for each search term, as ranked by YouTube's algorithm based on "relevance," were included, yielding a total of 200 videos for analysis.10 Videos were considered for inclusion if they met the following criteria: they were in English, their principal subject was frozen shoulder, and their audio and visual quality were satisfactory. Videos were excluded if they were duplicates, had no dialogue, were in a language other than English, were not about adhesive capsulitis, or were categorized as news, drama, or satire. There was no maximum video duration, and compilations of several episodes were counted as a single work. A YouTube account was set up for the research, and once duplicates were eliminated, a complete list of video URLs was compiled. In total, 173 videos were included in the study after the exclusion of 26 duplicates and one video in a language other than English.
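
The search described above was performed manually in a browser. Purely as an illustration (not part of the study's method), the sketch below shows how a comparable relevance-ranked query could be reproduced programmatically with the YouTube Data API v3; the API key placeholder and function name are hypothetical.

```python
# Illustrative sketch (not the authors' method): retrieve the top 50
# relevance-ranked videos for each search term and de-duplicate across terms,
# mirroring the 200 -> 173 inclusion step described above.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
SEARCH_TERMS = [
    "frozen shoulder",
    "frozen shoulder treatment",
    "adhesive capsulitis",
    "adhesive capsulitis treatment",
]

def collect_candidate_videos(api_key: str) -> dict:
    """Return {video_id: title} for the top 50 hits per term, de-duplicated."""
    youtube = build("youtube", "v3", developerKey=api_key)
    candidates = {}
    for term in SEARCH_TERMS:
        response = youtube.search().list(
            part="snippet",
            q=term,
            type="video",
            order="relevance",  # YouTube's default ranking, as used in the study
            maxResults=50,
        ).execute()
        for item in response["items"]:
            candidates[item["id"]["videoId"]] = item["snippet"]["title"]
    return candidates
```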

For each included YouTube video, the following attributes were recorded: (1) title, (2) video duration, (3) views, (4) video source/uploader, (5) content type, (6) days since upload, (7) view rate (views/day), and (8) likes. The authors and uploaders of the videos were classified into seven groups: (1) academic (related to authors or uploaders affiliated with research groups, universities, or colleges), (2) physician (related to independent physicians or groups of physicians without research or academic affiliation), (3) non-physicians (healthcare workers other than licensed medical doctors), (4) trainer, (5) medical source (content or animations from health websites), (6) patient, and (7) commercial. The content types were categorized as: (1) exercise education, (2) disease-specific information, (3) patient experience, (4) surgical technique or approach, (5) non-surgical management, and (6) advertising.
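
As a minimal sketch of the record described above (the field names are assumptions, not the authors' database schema), each video's attributes can be represented as follows, with the view rate derived as views divided by days since upload:

```python
from dataclasses import dataclass

@dataclass
class VideoRecord:
    title: str
    duration_minutes: float
    views: int
    source: str        # one of the seven uploader groups, e.g. "academic"
    content_type: str  # one of the six content categories, e.g. "exercise education"
    days_since_upload: int
    likes: int

    @property
    def view_rate(self) -> float:
        """View rate as defined above: views per day since upload."""
        return self.views / max(self.days_since_upload, 1)
```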

The criteria published in the Journal of the American Medical Association (JAMA) were used to evaluate the accuracy and reliability of the videos (Fig. 1).11 Four factors, each weighted at 1 point, provide a generic evaluation of the credibility of the material; a score of 4 indicates the highest accuracy and reliability, whereas a score of 0 indicates the opposite. Although these criteria have not been validated, they have been used extensively in the literature to assess the reliability of online resources.10,12

Fig. 1
JAMA criteria.

Three different scales were used to rate the educational value and quality of the adhesive capsulitis videos. The Global Quality Score (GQS) for educational content is calculated from five factors (Fig. 2),10,13 with a maximum possible score of 5 representing high-quality education. The "Adhesive Capsulitis Specific Score" (ACSS) was developed for content specific to adhesive capsulitis, based on the recommendations published by the American Academy of Orthopaedic Surgeons (Fig. 3). The ACSS is a 21-item questionnaire that assesses information about patient presentation and symptoms, adhesive capsulitis in general, diagnostic and assessment procedures, and available treatment choices. Higher scores, up to a maximum of 21, represent higher quality. Possible ACSS ratings ranged from very good (21 points), good (16 points), and moderate (12 points) to poor (8 points) and very poor (4 points).10,13 The DISCERN score was created in Oxford, United Kingdom, to evaluate the quality of written health-related materials. It consists of 16 questions, each scored from 1 to 5, giving a possible total of 16 to 80 (Fig. 4).14 The quality categories are very poor (16–28 points), poor (29–41 points), fair (42–54 points), good (55–67 points), and excellent (68–80 points).

Fig. 2
Global Quality Score.
Fig. 3
Adhesive Capsulitis Specific Score.
Fig. 4
DISCERN score.
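
The following is a minimal sketch (the function name is an assumption) of how a DISCERN total could be mapped to the quality bands listed above; the thresholds are taken directly from the text:

```python
def discern_category(total: int) -> str:
    """Map a DISCERN total (16-80) to its quality category."""
    if not 16 <= total <= 80:
        raise ValueError("DISCERN totals range from 16 to 80")
    if total <= 28:
        return "very poor"
    if total <= 41:
        return "poor"
    if total <= 54:
        return "fair"
    if total <= 67:
        return "good"
    return "excellent"

# Example: the mean DISCERN score reported in the Results (39.4) falls in the
# "poor" band.
print(discern_category(39))  # -> poor
```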

The videos included in the study were compiled by a non-observer author and presented to the observers in table format. The videos were examined and scored blindly by two observers who had been trained in the DISCERN, GQS, JAMA, and ACSS scoring systems before the evaluation. The intraclass correlation coefficient (ICC) was used to determine the level of agreement between observers, with values below 0.5 indicating low reliability, between 0.5 and 0.75 moderate reliability, between 0.75 and 0.9 good reliability, and above 0.9 excellent reliability.
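
As a small illustrative helper (not part of the authors' analysis), the agreement cut-offs quoted above can be expressed as:

```python
def icc_agreement(icc: float) -> str:
    """Interpret an intraclass correlation coefficient per the cut-offs above."""
    if icc < 0.5:
        return "low reliability"
    if icc < 0.75:
        return "moderate reliability"
    if icc <= 0.9:
        return "good reliability"
    return "excellent reliability"
```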

IBM SPSS Statistics version 20 was used for the data analysis. Continuous data were summarized as means and standard deviations, while categorical data were summarized as percentages and relative frequencies. Values were rounded to one decimal place. Video reliability and quality were compared among video sources and content types using either one-way analysis of variance (ANOVA) or Kruskal-Wallis tests, depending on the data distribution. Differences between groups were examined for statistical significance using the Mann-Whitney U test. The level of agreement between the reviewers was determined using the intraclass correlation coefficient (ICC). Spearman's rank correlation coefficient was used to examine correlations between the videos' usefulness ratings and their technical characteristics. Statistical significance was assumed when the p-value was less than 0.05.
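
The analysis was performed in IBM SPSS. Purely as an illustration of equivalent open-source tests (the DataFrame and its column names are hypothetical, and the ICC computation is omitted), the non-parametric comparisons and correlations described above could be run as follows:

```python
import pandas as pd
from scipy.stats import kruskal, mannwhitneyu, spearmanr

def analyze(df: pd.DataFrame) -> None:
    # Spearman correlation between a popularity metric and a quality score.
    rho, p = spearmanr(df["views"], df["gqs"])
    print(f"views vs GQS: r={rho:.3f}, p={p:.3f}")

    # Kruskal-Wallis test of GQS across uploader groups
    # (non-parametric alternative to one-way ANOVA).
    groups = [g["gqs"].to_numpy() for _, g in df.groupby("source")]
    h_stat, p = kruskal(*groups)
    print(f"GQS by source: H={h_stat:.2f}, p={p:.3f}")

    # Mann-Whitney U test between two specific uploader groups.
    trainer = df.loc[df["source"] == "trainer", "acss"]
    medical = df.loc[df["source"] == "medical source", "acss"]
    u_stat, p = mannwhitneyu(trainer, medical)
    print(f"ACSS, trainer vs medical source: U={u_stat:.1f}, p={p:.3f}")
```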

Results

The mean values of the features of the included videos were: video duration, 16.73 ± 123.09 minutes; number of views, 264,431.7 ± 617,136.8; days since upload, 1,537.95 ± 1,159.3; view rate, 269.75 ± 867.91 views/day; and number of likes, 3,826.78 ± 11,595.45. The distribution of video sources/uploaders was: 12 (6.9%) academic, 72 (41.6%) physicians, 71 (41%) non-physicians, 1 (0.6%) trainer, 13 (7.5%) medical sources, 2 (1.2%) patients, and 2 (1.2%) commercial. Regarding content, 44 (25.4%) videos consisted of exercise training, 112 (64.7%) of disease-specific information, 3 (1.7%) of patient experience, 11 (6.4%) of surgical technique/approach, and 3 (1.7%) of non-surgical management.

According to the JAMA criteria, 95.9% of the videos were rated 2 points or less. According to the GQS, 27.7% of the videos were rated 2 points or less. According to the DISCERN criteria, 38 (21.9%) of the videos were rated very poor, 47 (27.2%) poor, 62 (35.9%) fair, 22 (12.7%) good, and 4 (2.3%) excellent. According to the ACSS, 3 (1.7%) of the videos were rated very good, 31 (17.9%) good, 37 (21.4%) moderate, 40 (23.1%) poor, and 62 (35.9%) very poor. The mean ± standard deviation values of the scoring systems were 1.25 ± 0.51 for JAMA, 39.4 ± 13.4 for DISCERN, 2.83 ± 0.96 for GQS, and 7.43 ± 4.86 for ACSS. There were positive correlations between the number of views and GQS, between the view rate and GQS, and between the number of likes and GQS, DISCERN, and ACSS (r = 0.364, p < 0.001; r = 0.414, p < 0.001; r = 0.458, p < 0.001; r = 0.265, p < 0.001; and r = 0.168, p = 0.027, respectively). There was no statistically significant difference in the median JAMA, GQS, and DISCERN values according to the video source/uploader (p > 0.05). The values of the scoring systems according to the video source/uploader are summarized in Table 1.

Table 1
Mean and standard deviation values of scores by video source/uploader

A statistically significant difference was found between the median ACSS values according to the video source/uploader (p = 0.013). The difference was between the median ACSS values of videos uploaded by trainers and those uploaded by medical sources: the median ACSS value was 5 for trainer-uploaded videos and 9 for medical sources (Table 2). No statistically significant difference was found between the median JAMA, GQS, DISCERN, and ACSS values according to content type (Table 3).

Table 2
Comparison of scores by video source/uploader
Table 3
Comparison of scores by content type

Discussion

The essential findings of this study are as follows: according to the JAMA criteria, 95.9% of the videos were rated 2 points or less; according to the GQS, 27.7% of the videos were rated 2 points or less; according to the DISCERN criteria, 49.1% of the videos were rated poor or very poor; and according to the ACSS, 59% of the videos were rated poor or very poor. These findings are similar to those of Tang et al.,9 who evaluated the educational value and quality of adhesive capsulitis videos. The present study additionally assessed video reliability with the JAMA criteria and covered more search terms, and therefore more of the adhesive capsulitis videos that patients may encounter on YouTube. Another feature of this study is that no restriction was placed on video duration, because longer videos tend to contain more informational and educational content.8 Excluding long videos could therefore affect research results by omitting highly educational videos. As a result of this comprehensive evaluation, this study concluded that the reliability, quality, and educational level of YouTube videos related to adhesive capsulitis need to be improved.

Videos uploaded to YouTube do not go through an evaluation process.3 For this reason, the number of likes and views can create a perception of quality in patients and contribute to misinformation.15 As a result, videos that would actually benefit patients may receive fewer views.16,17 In this study, there were positive correlations between the number of likes and the quality scores, and between the number of views and the GQS. These findings suggest that patients with adhesive capsulitis tend to watch and like videos that are of higher quality and more educational, but that the number of such videos is insufficient.

The educational value of YouTube videos may vary depending on the uploader and source.18 In their study evaluating videos about hip arthritis, Koller et al.18 found videos from academic and physician sources to be more educational. In the present study, however, physicians and academic sources did not provide more educational information than other uploaders. Videos prepared with commercial concerns may have negative consequences for the treatment of patients.19 Commercial concerns may be a major cause of poor-quality videos. Given that most videos are made in accordance with the provider's own practice and carry no doctor-patient liability, most providers may feel free to advise viewers about only particular aspects of the condition and its treatment.3 This may lead patients with adhesive capsulitis to believe that the only treatment method presented is the right option and to request inappropriate treatment. A solution may be to prepare patient information platforms free of commercial concerns and to direct patients to them.

Young patients use many social media platforms other than YouTube to learn about their disease.20 Artificially intelligent conversational agents ("chatbots") have shown promise as tools for direct patient engagement and education; ChatGPT is one such example.21 Current natural-language AI algorithms are designed to take in unstructured, non-standardized input and produce human-sounding output. These algorithms draw on a large corpus of human-written material to generate answers with a high probability of matching the user's query. Chatbots have the potential to enhance medical care by supplying instantaneous answers to patient concerns, but because they are trained on language patterns rather than curated factual databases, they risk giving patients answers that appear reliable but are erroneous.22 Studies examining frozen shoulder content on other social media platforms could provide more information about the capacity and educational value of the information patients access online about this condition. There is also a need to evaluate the information that chatbots provide to patients about frozen shoulder. Based on such data, artificial intelligence models could be trained by doctors, and slideshows and videos providing accurate, reliable information could be prepared with artificial intelligence support and made available to patients.

This study has some limitations. Videos are continuously being added to YouTube, making it a dynamic platform, and the platform personalizes video recommendations using artificial intelligence. Therefore, the videos retrieved in our searches may only partially reflect the videos presented to patients. We used a web browser with cleared cookies and history to minimize personalized video presentation, a method also used in previous studies.12,23 In addition, using only English-language videos and searching from a single location may change the characteristics of the videos evaluated: the platform can offer different videos according to country and location, and non-English search terms and videos may have different informational content. Only the first 50 videos were evaluated for each search term, so the evaluated videos represent a small fraction of the videos associated with adhesive capsulitis, and the findings may change as the number of videos evaluated increases; however, this method has been used before.10,12 Finally, although we expanded the search terms used to evaluate adhesive capsulitis videos, we obtained data similar to the findings of Tang et al.9

The internet has made it easier than ever to access information on virtually any topic. However, this also means that a great deal of misinformation and disinformation is available online, and people may not be able to tell the difference between reliable and unreliable information. One way to address this problem is to filter information on the internet, for example with software that identifies and blocks harmful or misleading content; however, no filtering system is perfect, and some false or misleading information may still slip through. Another approach is to educate people to evaluate information critically, which includes teaching them how to identify reliable sources, spot bias, and assess the quality of evidence. It is also important to be aware of the limitations of the internet: it is a vast and ever-changing resource, and it can be difficult to keep up with all the new information being published. This means that it is important to be skeptical of information found online and to do one's own research before drawing conclusions. Some tips for evaluating information on the internet are:

• Consider the source of the information. Is it a credible website or organization?
• Look for evidence to support the claims being made. Are there any studies or statistics cited?
• Be aware of bias. Is the information coming from a biased source, such as a political party or a special interest group?
• Use common sense. If something sounds too good to be true, it probably is.

Following these tips can help ensure that the information obtained from the internet is accurate and reliable.

Considering these findings, there is a need for high-quality educational videos to inform patients. Clear, high-quality videos that address frozen shoulder as a whole should be prepared by shoulder surgeons and their associations, uploaded to public sites, and patients should be directed to them. In preparing such videos, creators can draw on the information available at https://www.mayoclinic.org/diseases-conditions/frozen-shoulder/symptoms-causes/syc-20372684, https://www.healthline.com/health/frozen-shoulder, and https://orthoinfo.aaos.org/en/diseases-conditions/frozen-shoulder/. In addition, by training artificial intelligence software on this disease, many high-quality videos on frozen shoulder could be prepared quickly and effectively and made available to patients.

Conclusion

YouTube videos on adhesive capsulitis are of insufficient quality, reliability, and educational value. There is a need for reliable, instructive, high-quality videos about adhesive capsulitis, so that patients can be directed to video sources with such content.

  • Work developed at the Department of Orthopedic and Traumatology, Prof. Dr. Cemil Taşcıoğlu City Hospital, İstanbul, Turkey.
  • Financial Support
    There was no financial support from public, commercial, or non-profit sources.

References

  • 1
    Richardson MA, Park W, Bernstein DN, Mesfin A. Analysis of the quality, reliability, and educational content of YouTube videos concerning spine tumors. Int J Spine Surg 2022;16(02):278–282
  • 2
O’Leary B, Saker C, Stamm MA, Mulcahey MK. YouTube videos lack efficacy as a patient education tool for rehabilitation and return to play following medial patellofemoral ligament reconstruction. Arthrosc Sports Med Rehabil 2022;4(03):e1111–e1118
  • 3
Umur L, Sürücü S. Are YouTube videos a sufficient resource for informing patients in the treatment of rotator cuff tears? J Health Sci Med 2022;5(01):99–103
  • 4
Challoumas D, Biddle M, McLean M, Millar NL. Comparison of treatments for frozen shoulder: A systematic review and meta-analysis. JAMA Netw Open 2020;3(12):e2029581
  • 5
    Uppal HS, Evans JP, Smith C. Frozen shoulder: A systematic review of therapeutic options. World J Orthop 2015;6(02):263–268
  • 6
Muller AL, Baker JF. Analysis of lumbar fusion and lumbar arthroplasty videos on YouTube. Int J Spine Surg 2022;16(02):283–290
  • 7
Yu JS, Manzi JE, Apostolakos JM, Carr II JB, Dines JS. YouTube as a source of patient education information for elbow ulnar collateral ligament injuries: a quality control content analysis. Clin Shoulder Elbow 2022;25(02):145–153
  • 8
    Kwak D, Park JW, Won Y, Kwon Y, Lee JI. Quality and reliability evaluation of online videos on carpal tunnel syndrome: a YouTube video-based study. BMJ Open 2022;12(04):e059239
  • 9
    Tang K, Azhar U, Babar M, et al. Assessing the Quality of YouTube Videos on Adhesive Capsulitis. Cureus 2022;14(07):e27406
  • 10
    Yüce A, İğde N, Ergün T, Mısır A. YouTube provides insufficient information on patellofemoral instability. Acta Orthop Traumatol Turc 2022;56(05):306–310
  • 11
    Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor–Let the reader and viewer beware. JAMA 1997;277(15):1244–1245
  • 12
Kunze KN, Krivicich LM, Verma NN, Chahla J. Quality of online video resources concerning patient education for the meniscus: A YouTube-based quality-control study. Arthroscopy 2020;36(01):233–238
  • 13
Erdem MN, Karaca S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine 2018;43(22):E1334–E1339
  • 14
    Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 1999;53(02):105–111
  • 15
Goyal R, Mercado AE, Ring D, Crijns TJ. Most YouTube videos about carpal tunnel syndrome have the potential to reinforce misconceptions. Clin Orthop Relat Res 2021;479(10):2296–2302
  • 16
Jones M, Wiberg A. Evaluating Youtube as a source of patient information on Dupuytren’s disease. World J Plast Surg 2017;6(03):396–398
  • 17
Staunton PF, Baker JF, Green J, Devitt A. Online Curves: A quality analysis of scoliosis videos on YouTube. Spine 2015;40(23):1857–1861
  • 18
Koller U, Waldstein W, Schatz KD, Windhager R. YouTube provides irrelevant information for the diagnosis and treatment of hip arthritis. Int Orthop 2016;40(10):1995–2002
  • 19
Desai T, Shariff A, Dhingra V, Minhas D, Eure M, Kats M. Is content really king? An objective analysis of the public’s response to medical videos on YouTube. PLoS One 2013;8(12):e82469
  • 20
    Curry E, Li X, Nguyen J, Matzkin E. Prevalence of internet and social media usage in orthopedic surgery. Orthop Rev (Pavia) 2014;6(03):5483
  • 21
    Bibault JE, Chaix B, Guillemassé A, et al. A Chatbot Versus Physicians to Provide Information for Patients With Breast Cancer: Blind, Randomized Controlled Noninferiority Trial. J Med Internet Res 2019;21(11):e15787
  • 22
    Sng GGR, Tung JYM, Lim DYZ, Bee YM. Potential and Pitfalls of ChatGPT and Natural-Language Artificial Intelligence Models for Diabetes Education. Diabetes Care 2023;46(05):e103–e105
  • 23
    Sahin AA, Boz M. Assessment of the quality and reliability of the information on lateral epicondylitis surgery on YouTube. Exp Biomed Res 2022;5(03):285–292

Publication Dates

  • Publication in this collection
    17 June 2024
  • Date of issue
    2024

History

  • Received
    16 July 2023
  • Accepted
    04 Sept 2023