Steps to replicate the issue:
- Install webservice inside a container created with the Build Service on the Toolforge cluster
- There are a number of ways to do this, but likely the easiest is to add git+https://gitlab.wikimedia.org/repos/cloud/toolforge/tools-webservice to a requirements.txt file and then build the image
- Run webservice status inside the resulting container
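A minimal sketch of the steps above. The repository URL comes from this report; the toolforge build start invocation is my assumption about the usual Build Service workflow, so check the current Toolforge documentation for the exact command:

```
# requirements.txt in the tool's git repository
git+https://gitlab.wikimedia.org/repos/cloud/toolforge/tools-webservice

# then, from a bastion as the tool user (assumed CLI invocation):
$ toolforge build start <your-tool-repo-url>
```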
What happens?:
$ webservice status
Traceback (most recent call last):
  File "/layers/heroku_python/dependencies/bin/webservice", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/layers/heroku_python/dependencies/lib/python3.12/site-packages/toolsws/cli/webservice.py", line 227, in main
    tool = Tool.from_currentuser()
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/layers/heroku_python/dependencies/lib/python3.12/site-packages/toolsws/tool.py", line 110, in from_currentuser
    pwd_entry = pwd.getpwuid(os.geteuid())
                ^^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'getpwuid(): uid not found: 55419'
What should have happened instead?:
$ webservice status
Your webservice is not running
Software version: 0.103.9
The assumption that the runtime user has an NSS passwd entry that can be looked up was reasonable on the Toolforge bastions, and even in our legacy containers with NSS LDAP support, but the recommended modern container solution (Build Service images) does not resolve the tool's UID via NSS, making this method of detecting the executing tool impossible.
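To illustrate the failure mode: Tool.from_currentuser() calls pwd.getpwuid(os.geteuid()), which raises KeyError when the container's UID (55419 above) has no passwd entry. A hedged sketch of a more tolerant lookup follows; the environment-variable fallback is a hypothetical workaround for illustration, not the fix tools-webservice actually implements:

```python
import os
import pwd


def resolve_username(uid):
    """Best-effort username lookup that survives UIDs missing from NSS.

    Inside a Build Service container, pwd.getpwuid() raises KeyError
    because the tool's UID has no passwd entry. Falling back to common
    environment variables is an assumed workaround, shown only to
    demonstrate the failure mode described in this report.
    """
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:
        # Containers typically still export USER or LOGNAME even
        # when NSS cannot resolve the UID.
        for var in ("USER", "LOGNAME"):
            name = os.environ.get(var)
            if name:
                return name
        return None
```

On a host with working NSS this behaves exactly like pwd.getpwuid(uid).pw_name; in the broken container it would return the environment-derived name (or None) instead of crashing.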
See also: T369573: `toolforge jobs` requires current user to be the tool user and listed in NSS passwd data