# Launcher for the LiteLLM proxy: optionally starts a local `ollama serve`
# process, persists the worker configuration, and serves the FastAPI app
# with uvicorn.
from proxy_server import app, save_worker_config
import uvicorn
import random
import json
import os
import subprocess
import socket

host = "0.0.0.0"
port = 8000
api_version = "2023-07-01-preview"
# The remaining defaults are not recoverable from the bytecode; the values
# below are assumptions.
model = alias = api_base = headers = config = None
temperature = max_tokens = request_timeout = max_budget = None
debug = detailed_debug = save = drop_params = False
add_function_to_prompt = use_queue = False
telemetry = True  # assumption: telemetry defaults on


def run_ollama_serve():
    """Start `ollama serve` in the background, discarding its output."""
    try:
        command = ["ollama", "serve"]
        with open(os.devnull, "w") as devnull:
            process = subprocess.Popen(command, stdout=devnull, stderr=devnull)
    except Exception as e:
        print(
            f"LiteLLM Warning: proxy started with `ollama` model; "
            f"`ollama serve` failed with Exception {e}. "
            f"Ensure you run `ollama serve`"
        )


def is_port_in_use(port):
    """Return True if something is already listening on `port` locally."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex(("localhost", port)) == 0


# An ollama model with no explicit api_base needs a local ollama server.
if model and "ollama" in model and api_base is None:
    run_ollama_serve()
elif headers:
    # Headers arrive as a JSON string and are decoded before use.
    headers = json.loads(headers)

save_worker_config(
    model=model,
    alias=alias,
    api_base=api_base,
    api_version=api_version,
    debug=debug,
    detailed_debug=detailed_debug,
    temperature=temperature,
    max_tokens=max_tokens,
    request_timeout=request_timeout,
    max_budget=max_budget,
    telemetry=telemetry,
    drop_params=drop_params,
    add_function_to_prompt=add_function_to_prompt,
    headers=headers,
    save=save,
    config=config,
    use_queue=use_queue,
)

# Fall back to a random port when the default is already taken; the exact
# range is an assumption, the bytecode only shows a random.randint call.
if port == 8000 and is_port_in_use(port):
    port = random.randint(1024, 49152)

if __name__ == "__main__":
    uvicorn.run(app, host=host, port=port)