import re

import requests
import pandas as pd
from bs4 import BeautifulSoup


def fetch_wikipedia_tables(url: str, handle_special_chars: bool = True) -> list[pd.DataFrame]:
    """
    Fetch tables from a Wikipedia URL with robust error handling.

    Parameters:
    -----------
    url : str
        The Wikipedia URL to fetch tables from.
    handle_special_chars : bool, default True
        Whether to clean special characters in data before parsing.

    Returns:
    --------
    list of pd.DataFrame
        A list of pandas DataFrames containing the tables found on the page.
    """
    try:
        all_tables = _fetch_tables_with_bs4(url)
        if handle_special_chars:
            for i, table in enumerate(all_tables):
                all_tables[i] = _clean_table(table)
        if all_tables:
            return all_tables
        print(f"No tables found at {url}")
        return []
    except Exception as e:
        print(f"Error fetching tables: {e}")
        return []


def _fetch_tables_with_bs4(url: str) -> list[pd.DataFrame]:
    """Method to fetch tables using BeautifulSoup."""
    try:
        response = requests.get(url)
        response.raise_for_status()
        soup = BeautifulSoup(response.content, "html.parser")

        tables = []
        for table in soup.find_all("table", {"class": "wikitable"}):
            data = []
            headers = []

            # Collect column headers from the table's <th> cells.
            for th in table.find_all("th"):
                headers.append(th.text.strip())

            # Fall back to the first row when the table has no <th> cells.
            if not headers and table.find("tr"):
                for cell in table.find("tr").find_all(["th", "td"]):
                    headers.append(cell.text.strip())

            # Skip the header row when headers were found.
            rows = table.find_all("tr")[1:] if headers else table.find_all("tr")
            for row in rows:
                row_data = []
                for cell in row.find_all(["th", "td"]):
                    row_data.append(cell.text.strip())
                if row_data:
                    data.append(row_data)

            if data:
                # Attach headers only when they match the data width.
                if headers and len(headers) == len(data[0]):
                    df = pd.DataFrame(data, columns=headers)
                else:
                    df = pd.DataFrame(data)
                tables.append(df)

        return tables
    except Exception as e:
        print(f"Error in BeautifulSoup fallback: {e}")
        return []


def _clean_table(df: pd.DataFrame) -> pd.DataFrame:
    """Clean special characters in a parsed table.

    The original body of this helper is not recoverable from this file; the
    implementation below is a minimal sketch that normalises non-breaking
    spaces and strips Wikipedia footnote markers such as "[1]" from string cells.
    """
    def _clean_cell(value):
        if isinstance(value, str):
            value = value.replace("\xa0", " ")
            value = re.sub(r"\[\d+\]", "", value)
            return value.strip()
        return value

    return df.apply(lambda col: col.map(_clean_cell))
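

# Usage sketch (not part of the original module): fetches tables from an
# illustrative Wikipedia article and prints a short summary of each one.
# The URL below is an assumption chosen for demonstration only.
if __name__ == "__main__":
    example_url = "https://en.wikipedia.org/wiki/List_of_countries_and_dependencies_by_population"
    for i, df in enumerate(fetch_wikipedia_tables(example_url)):
        print(f"Table {i}: {df.shape[0]} rows x {df.shape[1]} columns")
        print(df.head())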