Update robots.txt
Per https://blogs.bing.com/webmaster/May-2012/To-crawl-or-not-to-crawl,-that-is-BingBot-s-questi, if a robots.txt file contains a bingbot-specific section, bingbot ignores every directive in the default section. As a result, bingbot was the only bot able to crawl the /pdf links in the AMS PUI (at least, the only bot that abides by the robots.txt file).
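To illustrate the behavior described above, here is a minimal sketch of the problematic shape of the file. The `Crawl-delay` directive in the bingbot section is hypothetical, included only to show why a bingbot-specific section might exist:

```
# Default section: applies to any bot *without* its own section
User-agent: *
Disallow: /pdf

# Because this section exists, bingbot ignores the default
# section entirely, including the Disallow: /pdf rule
User-agent: bingbot
Crawl-delay: 1
```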
This MR updates the robots.txt file to prevent bingbot from crawling the /pdf links. Getting this to work required a couple of pipeline-related changes: one in the archivesspace-container repo (archivesspace-container!2 (merged)), and another applying the same provenance-related change to the docker buildx bake command that was also needed for the dps-frontend-api (related to https://github.com/docker/buildx/issues/1533).
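Since bingbot does not inherit directives from the default section, the fix is to repeat the `Disallow` rule inside the bingbot-specific section. A minimal sketch (the `Crawl-delay` line is hypothetical, standing in for whatever the existing bingbot section contains):

```
User-agent: *
Disallow: /pdf

# Bingbot only honors directives in its own section,
# so the /pdf rule must be duplicated here
User-agent: bingbot
Crawl-delay: 1
Disallow: /pdf
```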
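For reference, the provenance-related change amounts to disabling provenance attestations on the bake invocation, since the extra attestation manifests are what trigger the linked buildx issue. This is a sketch, not the exact pipeline command; the target name is illustrative:

```shell
# Disable provenance attestations so the pushed image does not
# carry the extra attestation manifest that breaks some registries
# (see docker/buildx#1533); "web" is a hypothetical target name
docker buildx bake --provenance=false web
```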