
"415 internal server error" for testing microservices on Windows #362

Closed
arun-gupta opened this issue Jul 2, 2024 · 1 comment


arun-gupta commented Jul 2, 2024

Configuration:

Edition	Windows 11 Home
Version	23H2
Installed on	2/29/2024
OS build	22631.3737
Experience	Windows Feature Experience Pack 1000.22700.1009.0

Hardware:

Device name	MSI
Processor	Intel(R) Core(TM) Ultra 9 185H   2.50 GHz
Installed RAM	32.0 GB (31.7 GB usable)
Device ID	924E52E5-2497-4C8D-88AE-31078A244EC7
Product ID	00342-21176-10892-AAOEM
System type	64-bit operating system, x64-based processor
Pen and touch	No pen or touch input is available for this display

Followed the demo instructions at https://github.com/opea-project/GenAIExamples/blob/main/ChatQnA/docker/xeon/README.md, using localhost for all ${host_ip} values.
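For reference, the README derives ${host_ip} from the host's externally reachable address rather than localhost; inside a container, localhost resolves to the container itself, so ports published on the host are not reachable through it. A small sketch of the difference (the `hostname -I` form is my reading of the README's Linux instructions, and host.docker.internal is Docker Desktop's built-in alias for the host machine — both are assumptions about this setup, not steps from the demo):

```shell
# On Linux the README-style setup points host_ip at the machine's own
# external address, e.g.:
#   export host_ip=$(hostname -I | awk '{print $1}')
# On Windows with Docker Desktop, containers reach services published on
# the host via the special DNS name below, not via localhost:
export host_ip="host.docker.internal"
echo "host_ip=${host_ip}"
```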

All the images were built successfully:

inteldemo@MSI MINGW64 ~/GenAIExamples/ChatQnA/docker/xeon (main)
$ docker images
REPOSITORY                                      TAG        IMAGE ID       CREATED        SIZE
opea/chatqna-conversation-ui                    latest     5e2af02d54f2   2 hours ago    43.9MB
opea/chatqna-ui                                 latest     32da202e81dd   2 hours ago    1.5GB
opea/chatqna                                    latest     e9a62b2ae517   2 hours ago    757MB
opea/dataprep-redis                             latest     c807e2422696   2 hours ago    4.16GB
opea/llm-tgi                                    latest     039d3e47ed08   3 hours ago    2.45GB
opea/reranking-tei                              latest     cc025381f00a   3 hours ago    2.58GB
opea/retriever-redis                            latest     59843be386dd   3 hours ago    3.75GB
opea/embedding-tei                              latest     8218df07dd3d   3 hours ago    3.48GB
ghcr.io/huggingface/text-embeddings-inference   cpu-1.2    51c71b7cb250   2 months ago   637MB
ghcr.io/huggingface/text-generation-inference   1.4        6fdbcd247b93   3 months ago   10.2GB
redis/redis-stack                               7.2.0-v9   59d6058ec513   3 months ago   791MB

Docker compose command completed:


 Container tgi-service  Starting
 Container tei-embedding-server  Starting
 Container tei-reranking-server  Started
 Container reranking-tei-xeon-server  Starting
 Container tei-embedding-server  Started
 Container embedding-tei-server  Starting
 Container redis-vector-db  Started
 Container retriever-redis-server  Starting
 Container dataprep-redis-server  Starting
 Container tgi-service  Started
 Container llm-tgi-server  Starting
 Container reranking-tei-xeon-server  Started
 Container embedding-tei-server  Started
 Container retriever-redis-server  Started
 Container dataprep-redis-server  Started
 Container llm-tgi-server  Started
 Container chatqna-xeon-backend-server  Starting
 Container chatqna-xeon-backend-server  Started
 Container chatqna-xeon-ui-server  Starting
 Container chatqna-xeon-conversation-ui-server  Starting
 Container chatqna-xeon-conversation-ui-server  Started
 Container chatqna-xeon-ui-server  Started

inteldemo@MSI MINGW64 ~/GenAIExamples/ChatQnA/docker/xeon (main)

Testing the microservices gives the following error. The first request (to the TEI endpoint on port 6006) succeeds; the second (to the embedding microservice on port 6000) returns "Internal Server Error" (HTTP 500 — the "415" in the curl output below is the transfer-speed column of curl's progress meter, not a status code):

inteldemo@MSI MINGW64 ~/GenAIExamples/ChatQnA/docker/xeon (main)
$ curl ${host_ip}:6006/embed \
    -X POST \
    -d '{"inputs":"What is Deep Learning?"}' \
    -H 'Content-Type: application/json'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  9579  100  9544  100    35   195k    733 --:--:-- --:--:-- --:--:--  199k
[[0.00030899697,-0.06356526,0.0025720652,-0.012404508,0.050649904,0.023426017,0.022131795,0.00075951463,-0.00021142521,-0.033512212, ... ,-0.021343611,-0.029969858,-0.0049175993]]
(9.5 kB embedding vector truncated; in the original output curl's progress meter was interleaved with the JSON)
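The progress-meter noise interleaved with the JSON above can be avoided by running curl in silent mode — a sketch of the same request (assuming localhost for ${host_ip}, as in this setup):

```shell
# -s suppresses curl's progress meter so the JSON body prints cleanly;
# -S still prints an error message if the request itself fails.
curl -sS http://localhost:6006/embed \
    -X POST \
    -d '{"inputs":"What is Deep Learning?"}' \
    -H 'Content-Type: application/json'
```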

inteldemo@MSI MINGW64 ~/GenAIExamples/ChatQnA/docker/xeon (main)
$ curl http://${host_ip}:6000/v1/embeddings \
  -X POST \
  -d '{"text":"hello"}' \
  -H 'Content-Type: application/json'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    37  100    21  100    16    235    179 --:--:-- --:--:-- --:--:--   415
Internal Server Error

Here are the docker compose logs:

$ docker compose -f docker_compose.yaml logs
time="2024-07-01T17:58:26-07:00" level=warning msg="The \"http_proxy\" variable is not set. Defaulting to a blank string."
time="2024-07-01T17:58:26-07:00" level=warning msg="The \"https_proxy\" variable is not set. Defaulting to a blank string."
time="2024-07-01T17:58:26-07:00" level=warning msg="The \"no_proxy\" variable is not set. Defaulting to a blank string."
time="2024-07-01T17:58:26-07:00" level=warning msg="The \"LANGCHAIN_API_KEY\" variable is not set. Defaulting to a blank string."
time="2024-07-01T17:58:26-07:00" level=warning msg="The \"LANGCHAIN_TRACING_V2\" variable is not set. Defaulting to a blank string."
(the five warnings above repeat once per service; duplicates trimmed)
time="2024-07-01T17:58:26-07:00" level=warning msg="C:\\Users\\inteldemo\\GenAIExamples\\ChatQnA\\docker\\xeon\\docker_compose.yaml: `version` is obsolete"
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
(the retriever-redis-server line above repeats throughout the log; the container appears to be stuck in a restart loop — duplicates trimmed, and the interleaved per-container output below is grouped by service for readability)
embedding-tei-server    | /usr/local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".
embedding-tei-server    | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
embedding-tei-server    |   warnings.warn(
embedding-tei-server    | [2024-07-01 23:53:48,888] [    INFO] - CORS is enabled.
embedding-tei-server    | [2024-07-01 23:53:48,889] [    INFO] - Setting up HTTP server
embedding-tei-server    | [2024-07-01 23:53:48,890] [    INFO] - Uvicorn server setup on port 6000
embedding-tei-server    | INFO:     Waiting for application startup.
embedding-tei-server    | INFO:     Application startup complete.
embedding-tei-server    | INFO:     Uvicorn running on http://0.0.0.0:6000 (Press CTRL+C to quit)
embedding-tei-server    | [2024-07-01 23:53:49,041] [    INFO] - HTTP server setup successful
embedding-tei-server    | TEI Gaudi Embedding initialized.
dataprep-redis-server   | /home/user/.local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:161: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".
dataprep-redis-server   | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
dataprep-redis-server   |   warnings.warn(
dataprep-redis-server   | /home/user/.local/lib/python3.11/site-packages/langchain/__init__.py:29: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
dataprep-redis-server   |   warnings.warn(
dataprep-redis-server   | /home/user/.local/lib/python3.11/site-packages/langchain/__init__.py:29: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
dataprep-redis-server   |   warnings.warn(
dataprep-redis-server   | [2024-07-01 23:53:53,122] [    INFO] - CORS is enabled.
dataprep-redis-server   | [2024-07-01 23:53:53,123] [    INFO] - Setting up HTTP server
dataprep-redis-server   | [2024-07-01 23:53:53,124] [    INFO] - Uvicorn server setup on port 6007
dataprep-redis-server   | INFO:     Waiting for application startup.
dataprep-redis-server   | INFO:     Application startup complete.
dataprep-redis-server   | INFO:     Uvicorn running on http://0.0.0.0:6007 (Press CTRL+C to quit)
dataprep-redis-server   | [2024-07-01 23:53:53,158] [    INFO] - HTTP server setup successful
embedding-tei-server    | INFO:     172.18.0.1:59516 - "POST /v1/embeddings HTTP/1.1" 500 Internal Server Error
embedding-tei-server    | ERROR:    Exception in ASGI application
embedding-tei-server    | Traceback (most recent call last):
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
embedding-tei-server    |     sock = connection.create_connection(
embedding-tei-server    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
embedding-tei-server    |     raise err
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection
embedding-tei-server    |     sock.connect(sa)
embedding-tei-server    | ConnectionRefusedError: [Errno 111] Connection refused
embedding-tei-server    |
embedding-tei-server    | The above exception was the direct cause of the following exception:
embedding-tei-server    |
embedding-tei-server    | Traceback (most recent call last):
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
embedding-tei-server    |     response = self._make_request(
embedding-tei-server    |                ^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 496, in _make_request
embedding-tei-server    |     conn.request(
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 395, in request
embedding-tei-server    |     self.endheaders()
embedding-tei-server    |   File "/usr/local/lib/python3.11/http/client.py", line 1289, in endheaders
embedding-tei-server    |     self._send_output(message_body, encode_chunked=encode_chunked)
embedding-tei-server    |   File "/usr/local/lib/python3.11/http/client.py", line 1048, in _send_output
embedding-tei-server    |     self.send(msg)
embedding-tei-server    |   File "/usr/local/lib/python3.11/http/client.py", line 986, in send
embedding-tei-server    |     self.connect()
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 243, in connect
embedding-tei-server    |     self.sock = self._new_conn()
embedding-tei-server    |                 ^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 218, in _new_conn
embedding-tei-server    |     raise NewConnectionError(
embedding-tei-server    | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fb3994f7dd0>: Failed to establish a new connection: [Errno 111] Connection refused
embedding-tei-server    |
embedding-tei-server    | The above exception was the direct cause of the following exception:
embedding-tei-server    |
embedding-tei-server    | Traceback (most recent call last):
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
embedding-tei-server    |     resp = conn.urlopen(
embedding-tei-server    |            ^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 844, in urlopen
embedding-tei-server    |     retries = retries.increment(
embedding-tei-server    |               ^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment
embedding-tei-server    |     raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
embedding-tei-server    |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    | urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=6006): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb3994f7dd0>: Failed to establish a new connection: [Errno 111] Connection refused'))
embedding-tei-server    |
embedding-tei-server    | During handling of the above exception, another exception occurred:
embedding-tei-server    |
embedding-tei-server    | Traceback (most recent call last):
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
embedding-tei-server    |     result = await app(  # type: ignore[func-returns-value]
embedding-tei-server    |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
embedding-tei-server    |     return await self.app(scope, receive, send)
embedding-tei-server    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
embedding-tei-server    |     await super().__call__(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
embedding-tei-server    |     await self.middleware_stack(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
embedding-tei-server    |     raise exc
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
embedding-tei-server    |     await self.app(scope, receive, _send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/prometheus_fastapi_instrumentator/middleware.py", line 174, in __call__
embedding-tei-server    |     raise exc
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/prometheus_fastapi_instrumentator/middleware.py", line 172, in __call__
embedding-tei-server    |     await self.app(scope, receive, send_wrapper)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
embedding-tei-server    |     await self.app(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
embedding-tei-server    |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
embedding-tei-server    |     raise exc
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
embedding-tei-server    |     await app(scope, receive, sender)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
embedding-tei-server    |     await self.middleware_stack(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
embedding-tei-server    |     await route.handle(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
embedding-tei-server    |     await self.app(scope, receive, send)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
embedding-tei-server    |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server    |     raise exc
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server    |     await app(scope, receive, sender)
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
embedding-tei-server    |     response = await func(request)
embedding-tei-server    |                ^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
embedding-tei-server    |     raw_response = await run_endpoint_function(
embedding-tei-server    |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
embedding-tei-server    |     return await run_in_threadpool(dependant.call, **values)
embedding-tei-server    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
embedding-tei-server    |     return await anyio.to_thread.run_sync(func, *args)
embedding-tei-server    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
embedding-tei-server    |     return await get_async_backend().run_sync_in_worker_thread(
embedding-tei-server    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
embedding-tei-server    |     return await future
embedding-tei-server    |            ^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
embedding-tei-server    |     result = context.run(func, *args)
embedding-tei-server    |              ^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/langsmith/run_helpers.py", line 399, in wrapper
embedding-tei-server    |     raise e
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/langsmith/run_helpers.py", line 395, in wrapper
embedding-tei-server    |     function_result = func(*args, **kwargs)
embedding-tei-server    |                       ^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/comps/embeddings/langchain/embedding_tei.py", line 34, in embedding
embedding-tei-server    |     embed_vector = embeddings.embed_query(input.text)
embedding-tei-server    |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/huggingface_hub.py", line 109, in embed_query
embedding-tei-server    |     response = self.embed_documents([text])[0]
embedding-tei-server    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/huggingface_hub.py", line 95, in embed_documents
embedding-tei-server    |     responses = self.client.post(
embedding-tei-server    |                 ^^^^^^^^^^^^^^^^^
embedding-tei-server    |   File "/home/user/.local/lib/python3.11/site-packages/huggingface_hub/inference/_client.py", line 259, in post
embedding-tei-server    |     response = get_session().post(
embedding-tei-server                 |                ^^^^^^^^^^^^^^^^^^^
embedding-tei-server                 |   File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 637, in post
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: using the "epoll" event method
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: nginx/1.27.0
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: built by gcc 13.2.1 20231014 (Alpine 13.2.1_git20231014)
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: OS: Linux 5.15.153.1-microsoft-standard-WSL2
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: getrlimit(RLIMIT_NOFILE): 1048576:1048576
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker processes
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 7
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 8
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 9
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 10
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 11
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 12
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 13
retriever-redis-server  | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 14
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 15
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 16
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 17
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 18
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 19
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 20
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server                 |     return self.request("POST", url, data=data, json=json, **kwargs)
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 21
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server                 |   File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 22
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 23
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 24
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server                 |     resp = self.send(prep, **send_kwargs)
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 25
embedding-tei-server                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server                 |   File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 26
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 27
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
chatqna-xeon-conversation-ui-server  | 2024/07/01 23:53:46 [notice] 1#1: start worker process 28
embedding-tei-server                 |     r = adapter.send(request, **kwargs)
embedding-tei-server                 |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server                 |   File "/home/user/.local/lib/python3.11/site-packages/huggingface_hub/utils/_http.py", line 66, in send
embedding-tei-server                 |     return super().send(request, *args, **kwargs)
embedding-tei-server                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
embedding-tei-server                 |   File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
embedding-tei-server                 |     raise ConnectionError(e, request=request)
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
embedding-tei-server                 | requests.exceptions.ConnectionError: (MaxRetryError("HTTPConnectionPool(host='localhost', port=6006): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb3994f7dd0>: Failed to establish a new connection: [Errno 111] Connection refused'))"), '(Request ID: 691b623f-7d8b-4a39-936c-5016f73e7fe5)')
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
retriever-redis-server               | exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory
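The repeated `exec /home/user/comps/retrievers/langchain/redis/run.sh: no such file or directory` is a classic symptom of a shell script checked out on Windows with CRLF line endings (e.g. via `git` with `core.autocrlf=true`): the kernel reads the shebang as an interpreter literally named `/bin/sh\r`, which does not exist, and reports the misleading "no such file or directory". Not part of the original report, but a minimal sketch for checking a script for CRLF endings before rebuilding the image (the `demo_run.sh` path is a throwaway example, not a file from the repo):

```python
from pathlib import Path

def has_crlf(path: str) -> bool:
    """True if the file contains Windows CRLF line endings.

    A CRLF after the shebang makes `exec` fail with
    'no such file or directory' even though the script exists.
    """
    return b"\r\n" in Path(path).read_bytes()

# Demo on a throwaway file with Windows line endings:
demo = Path("demo_run.sh")
demo.write_bytes(b"#!/bin/sh\r\necho hello\r\n")
print(has_crlf("demo_run.sh"))  # → True
demo.unlink()
```

If the check reports CRLF, re-cloning with `git config core.autocrlf false` (or converting the scripts with `dos2unix`) and rebuilding the image is the usual fix.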
reranking-tei-xeon-server            | /home/user/.local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:161: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".
reranking-tei-xeon-server            |
reranking-tei-xeon-server            | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
reranking-tei-xeon-server            |   warnings.warn(
reranking-tei-xeon-server            | [2024-07-01 23:53:49,339] [    INFO] - CORS is enabled.
reranking-tei-xeon-server            | [2024-07-01 23:53:49,340] [    INFO] - Setting up HTTP server
reranking-tei-xeon-server            | [2024-07-01 23:53:49,341] [    INFO] - Uvicorn server setup on port 8000
reranking-tei-xeon-server            | INFO:     Waiting for application startup.
reranking-tei-xeon-server            | INFO:     Application startup complete.
reranking-tei-xeon-server            | INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
reranking-tei-xeon-server            | [2024-07-01 23:53:49,371] [    INFO] - HTTP server setup successful
chatqna-xeon-backend-server          | /usr/local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:161: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".
chatqna-xeon-backend-server          |
chatqna-xeon-backend-server          | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
chatqna-xeon-backend-server          |   warnings.warn(
chatqna-xeon-backend-server          | [2024-07-01 23:53:47,386] [    INFO] - CORS is enabled.
chatqna-xeon-backend-server          | [2024-07-01 23:53:47,389] [    INFO] - Setting up HTTP server
chatqna-xeon-backend-server          | [2024-07-01 23:53:47,390] [    INFO] - Uvicorn server setup on port 8888
chatqna-xeon-backend-server          | INFO:     Waiting for application startup.
chatqna-xeon-backend-server          | INFO:     Application startup complete.
chatqna-xeon-backend-server          | INFO:     Uvicorn running on http://0.0.0.0:8888 (Press CTRL+C to quit)
chatqna-xeon-backend-server          | [2024-07-01 23:53:47,435] [    INFO] - HTTP server setup successful
tei-reranking-server                 | 2024-07-01T23:53:43.243758Z  INFO text_embeddings_router: router/src/main.rs:140: Args { model_id: "BAA*/***-********-*ase", revision: None, tokenization_workers: None, dtype: None, pooling: None, max_concurrent_requests: 512, max_batch_tokens: 16384, max_batch_requests: None, max_client_batch_size: 32, auto_truncate: true, hf_api_token: None, hostname: "c9522f37e48c", port: 80, uds_path: "/tmp/text-embeddings-inference-server", huggingface_hub_cache: Some("/data"), payload_limit: 2000000, api_key: None, json_output: false, otlp_endpoint: None, cors_allow_origin: None }
tei-reranking-server                 | 2024-07-01T23:53:43.243884Z  INFO hf_hub: /usr/local/cargo/git/checkouts/hf-hub-1aadb4c6e2cbe1ba/b167f69/src/lib.rs:55: Token file not found "/root/.cache/huggingface/token"
tei-reranking-server                 | 2024-07-01T23:53:44.551204Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:20: Starting download
tei-reranking-server                 | 2024-07-01T23:54:23.530040Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:37: Model artifacts downloaded in 38.98099486s
tei-reranking-server                 | 2024-07-01T23:54:24.026409Z  WARN text_embeddings_router: router/src/lib.rs:165: Could not find a Sentence Transformers config
tei-reranking-server                 | 2024-07-01T23:54:24.026432Z  INFO text_embeddings_router: router/src/lib.rs:169: Maximum number of tokens per request: 512
tei-reranking-server                 | 2024-07-01T23:54:24.026763Z  INFO text_embeddings_core::tokenization: core/src/tokenization.rs:23: Starting 11 tokenization workers
tei-reranking-server                 | 2024-07-01T23:54:25.973239Z  INFO text_embeddings_router: router/src/lib.rs:194: Starting model backend
tei-reranking-server                 | 2024-07-01T23:54:25.997699Z  INFO text_embeddings_backend_candle: backends/candle/src/lib.rs:132: Starting Bert model on Cpu
tei-reranking-server                 | 2024-07-01T23:55:08.569986Z  WARN text_embeddings_router: router/src/lib.rs:211: Backend does not support a batch size > 4
tei-reranking-server                 | 2024-07-01T23:55:08.570028Z  WARN text_embeddings_router: router/src/lib.rs:212: forcing `max_batch_requests=4`
tei-reranking-server                 | 2024-07-01T23:55:08.570288Z  WARN text_embeddings_router: router/src/lib.rs:263: Invalid hostname, defaulting to 0.0.0.0
tei-reranking-server                 | 2024-07-01T23:55:08.616914Z  INFO text_embeddings_router::http::server: router/src/http/server.rs:1555: Starting HTTP server: 0.0.0.0:80
tei-reranking-server                 | 2024-07-01T23:55:08.616942Z  INFO text_embeddings_router::http::server: router/src/http/server.rs:1556: Ready
tei-embedding-server                 | 2024-07-01T23:53:43.255542Z  INFO text_embeddings_router: router/src/main.rs:140: Args { model_id: "BAA*/***-****-**-v1.5", revision: None, tokenization_workers: None, dtype: None, pooling: None, max_concurrent_requests: 512, max_batch_tokens: 16384, max_batch_requests: None, max_client_batch_size: 32, auto_truncate: true, hf_api_token: None, hostname: "3be8142797f7", port: 80, uds_path: "/tmp/text-embeddings-inference-server", huggingface_hub_cache: Some("/data"), payload_limit: 2000000, api_key: None, json_output: false, otlp_endpoint: None, cors_allow_origin: None }
tei-embedding-server                 | 2024-07-01T23:53:43.255646Z  INFO hf_hub: /usr/local/cargo/git/checkouts/hf-hub-1aadb4c6e2cbe1ba/b167f69/src/lib.rs:55: Token file not found "/root/.cache/huggingface/token"
tei-embedding-server                 | 2024-07-01T23:53:44.350622Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:20: Starting download
tei-embedding-server                 | 2024-07-01T23:54:18.879425Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:37: Model artifacts downloaded in 34.530238979s
tei-embedding-server                 | 2024-07-01T23:54:18.903716Z  INFO text_embeddings_router: router/src/lib.rs:169: Maximum number of tokens per request: 512
tei-embedding-server                 | 2024-07-01T23:54:18.904131Z  INFO text_embeddings_core::tokenization: core/src/tokenization.rs:23: Starting 11 tokenization workers
tei-embedding-server                 | 2024-07-01T23:54:18.935899Z  INFO text_embeddings_router: router/src/lib.rs:194: Starting model backend
tei-embedding-server                 | 2024-07-01T23:54:18.950913Z  INFO text_embeddings_backend_candle: backends/candle/src/lib.rs:124: Starting Bert model on Cpu
tei-embedding-server                 | 2024-07-01T23:54:38.454590Z  WARN text_embeddings_router: router/src/lib.rs:211: Backend does not support a batch size > 4
tei-embedding-server                 | 2024-07-01T23:54:38.454617Z  WARN text_embeddings_router: router/src/lib.rs:212: forcing `max_batch_requests=4`
llm-tgi-server                       | /usr/local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".
llm-tgi-server                       |
llm-tgi-server                       | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
llm-tgi-server                       |   warnings.warn(
llm-tgi-server                       | [2024-07-01 23:53:47,613] [    INFO] - CORS is enabled.
llm-tgi-server                       | [2024-07-01 23:53:47,614] [    INFO] - Setting up HTTP server
llm-tgi-server                       | [2024-07-01 23:53:47,615] [    INFO] - Uvicorn server setup on port 9000
llm-tgi-server                       | INFO:     Waiting for application startup.
llm-tgi-server                       | INFO:     Application startup complete.
llm-tgi-server                       | INFO:     Uvicorn running on http://0.0.0.0:9000 (Press CTRL+C to quit)
llm-tgi-server                       | [2024-07-01 23:53:47,655] [    INFO] - HTTP server setup successful
chatqna-xeon-ui-server               |
chatqna-xeon-ui-server               | > [email protected] preview
chatqna-xeon-ui-server               | > vite preview --port 5173 --host 0.0.0.0
chatqna-xeon-ui-server               |
chatqna-xeon-ui-server               |   ➜  Local:   http://localhost:5173/
chatqna-xeon-ui-server               |   ➜  Network: http://172.18.0.13:5173/
tei-embedding-server                 | 2024-07-01T23:54:38.455129Z  WARN text_embeddings_router: router/src/lib.rs:263: Invalid hostname, defaulting to 0.0.0.0
tei-embedding-server                 | 2024-07-01T23:54:38.462402Z  INFO text_embeddings_router::http::server: router/src/http/server.rs:1555: Starting HTTP server: 0.0.0.0:80
tei-embedding-server                 | 2024-07-01T23:54:38.462483Z  INFO text_embeddings_router::http::server: router/src/http/server.rs:1556: Ready
tei-embedding-server                 | 2024-07-01T23:57:52.633781Z  INFO embed{total_time="28.925152ms" tokenization_time="1.179977ms" queue_time="429.605µs" inference_time="26.819273ms"}: text_embeddings_router::http::server: router/src/http/server.rs:590: Success
redis-vector-db                      | 9:C 01 Jul 2024 23:53:43.257 * oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis-vector-db                      | 9:C 01 Jul 2024 23:53:43.257 * Redis version=7.2.4, bits=64, commit=00000000, modified=0, pid=9, just started
redis-vector-db                      | 9:C 01 Jul 2024 23:53:43.257 * Configuration loaded
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.257 * monotonic clock: POSIX clock_gettime
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.258 * Running mode=standalone, port=6379.
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.258 * Module 'RedisCompat' loaded from /opt/redis-stack/lib/rediscompat.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.259 * <search> Redis version found by RedisSearch : 7.2.4 - oss
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.259 * <search> RediSearch version 2.8.12 (Git=2.8-32fdaca)
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.259 * <search> Low level api version 1 initialized successfully
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.259 * <search> concurrent writes: OFF, gc: ON, prefix min length: 2, prefix max expansions: 200, query timeout (ms): 500, timeout policy: return, cursor read size: 1000, cursor max idle (ms): 300000, max doctable size: 1000000, max number of search results: 10000, search pool size: 20, index pool size: 8,
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.260 * <search> Initialized thread pools!
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.260 * <search> Enabled role change notification
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.260 * Module 'search' loaded from /opt/redis-stack/lib/redisearch.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.262 * <timeseries> RedisTimeSeries version 11011, git_sha=0299ac12a6bf298028859c41ba0f4d8dc842726b
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.262 * <timeseries> Redis version found by RedisTimeSeries : 7.2.4 - oss
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.262 * <timeseries> loaded default CHUNK_SIZE_BYTES policy: 4096
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.262 * <timeseries> loaded server DUPLICATE_POLICY: block
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.262 * <timeseries> Setting default series ENCODING to: compressed
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <timeseries> Detected redis oss
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * Module 'timeseries' loaded from /opt/redis-stack/lib/redistimeseries.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Created new data type 'ReJSON-RL'
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> version: 20609 git sha: unknown branch: unknown
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Exported RedisJSON_V1 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Exported RedisJSON_V2 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Exported RedisJSON_V3 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Exported RedisJSON_V4 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Exported RedisJSON_V5 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <ReJSON> Enabled diskless replication
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * Module 'ReJSON' loaded from /opt/redis-stack/lib/rejson.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <search> Acquired RedisJSON_V5 API
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * <bf> RedisBloom version 2.6.12 (Git=unknown)
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.263 * Module 'bf' loaded from /opt/redis-stack/lib/redisbloom.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.264 * <redisgears_2> Created new data type 'GearsType'
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.264 * <redisgears_2> Detected redis oss
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.264 # <redisgears_2> could not initialize RedisAI_InitError
redis-vector-db                      |
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.264 * <redisgears_2> Failed loading RedisAI API.
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.264 * <redisgears_2> RedisGears v2.0.19, sha='671030bbcb7de4582d00575a0902f826da3efe73', build_type='release', built_for='Linux-ubuntu22.04.x86_64'.
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.265 * <redisgears_2> Registered backend: js.
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.265 * Module 'redisgears_2' loaded from /opt/redis-stack/lib/redisgears.so
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.265 * Server initialized
redis-vector-db                      | 9:M 01 Jul 2024 23:53:43.265 * Ready to accept connections tcp
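Note that tei-embedding-server itself logs a successful `embed` request, while the embedding-tei-server traceback earlier shows `localhost:6006` refused. This fits the report's note that `localhost` was used for all `${host_ip}`: inside a container, `localhost` resolves to the container itself, not to the host where port 6006 is published. A minimal sketch, under that assumption, for discovering a host LAN IP to export as `host_ip` before running compose (the UDP connect is a routing trick; no packet is actually sent):

```python
import socket

def host_lan_ip() -> str:
    """Best-effort LAN IP of this machine, usable as ${host_ip} in place of localhost."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # UDP connect only selects the outbound interface; nothing is transmitted.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No route available (offline); fall back to loopback so callers can detect it.
        return "127.0.0.1"
    finally:
        s.close()

print(host_lan_ip())
```

On Linux, `hostname -I | awk '{print $1}'` gives the same answer; the key point is that the exported `host_ip` must be reachable from inside the containers, which loopback is not.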
tgi-service                          | 2024-07-01T23:53:43.268878Z  INFO text_generation_launcher: Args { model_id: "Intel/neural-chat-7b-v3-3", revision: None, validation_workers: 2, sharded: None, num_shard: None, quantize: None, speculate: None, dtype: None, trust_remote_code: false, max_concurrent_requests: 128, max_best_of: 2, max_stop_sequences: 4, max_top_n_tokens: 5, max_input_length: 1024, max_total_tokens: 2048, waiting_served_ratio: 1.2, max_batch_prefill_tokens: 4096, max_batch_total_tokens: None, max_waiting_tokens: 20, max_batch_size: None, enable_cuda_graphs: false, hostname: "aa74d3e3bffd", port: 80, shard_uds_path: "/tmp/text-generation-server", master_addr: "localhost", master_port: 29500, huggingface_hub_cache: Some("/data"), weights_cache_override: None, disable_custom_kernels: false, cuda_memory_fraction: 1.0, rope_scaling: None, rope_factor: None, json_output: false, otlp_endpoint: None, cors_allow_origin: [], watermark_gamma: None, watermark_delta: None, ngrok: false, ngrok_authtoken: None, ngrok_edge: None, tokenizer_config_path: None, disable_grammar_support: false, env: false }
tgi-service                          | 2024-07-01T23:53:43.268998Z  INFO download: text_generation_launcher: Starting download process.
tgi-service                          | 2024-07-01T23:53:47.829638Z  WARN text_generation_launcher: No safetensors weights found for model Intel/neural-chat-7b-v3-3 at revision None. Downloading PyTorch weights.
tgi-service                          |
tgi-service                          | 2024-07-01T23:53:48.073202Z  INFO text_generation_launcher: Download file: pytorch_model-00001-of-00002.bin
tgi-service                          |
tgi-service                          | 2024-07-01T23:57:16.361585Z  INFO text_generation_launcher: Downloaded /data/models--Intel--neural-chat-7b-v3-3/snapshots/bdd31cf498d13782cc7497cba5896996ce429f91/pytorch_model-00001-of-00002.bin in 0:03:28.
tgi-service                          |
tgi-service                          | 2024-07-01T23:57:16.361995Z  INFO text_generation_launcher: Download: [1/2] -- ETA: 0:03:28
tgi-service                          |
tgi-service                          | 2024-07-01T23:57:16.369007Z  INFO text_generation_launcher: Download file: pytorch_model-00002-of-00002.bin
tgi-service                          |
tgi-service                          | 2024-07-01T23:58:43.147211Z  INFO text_generation_launcher: Downloaded /data/models--Intel--neural-chat-7b-v3-3/snapshots/bdd31cf498d13782cc7497cba5896996ce429f91/pytorch_model-00002-of-00002.bin in 0:01:26.
tgi-service                          |
tgi-service                          | 2024-07-01T23:58:43.147256Z  INFO text_generation_launcher: Download: [2/2] -- ETA: 0
tgi-service                          |
tgi-service                          | 2024-07-01T23:58:43.147354Z  WARN text_generation_launcher: No safetensors weights found for model Intel/neural-chat-7b-v3-3 at revision None. Converting PyTorch weights to safetensors.
tgi-service                          |
tgi-service                          | 2024-07-01T23:59:53.587598Z ERROR download: text_generation_launcher: Download process was signaled to shutdown with signal 9:
tgi-service                          | The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
tgi-service                          | Error: DownloadError
@arun-gupta arun-gupta changed the title "415 internal server error" for testing microservices "415 internal server error" for testing microservices on Windows Jul 2, 2024
@yinghu5 yinghu5 added the aitce label Jul 2, 2024
@hshen14 hshen14 (Contributor) commented Jul 2, 2024

This is the Xeon BKC (Best Known Configuration), which is not applicable to an AI PC. Please refer to #356

@hshen14 hshen14 closed this as completed Jul 2, 2024