Reverse Proxy to ENFORCE the robots.txt against malicious crawlers that don't respect it
forgejo-crawler-blocker
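To make the one-line description above concrete, here is a minimal sketch of the enforcement idea in Go. This is not the real main.go: the upstream URL, listen address, robots.txt location, and the keyword-based crawler heuristic are all illustrative assumptions; the actual service also keeps persistent state in traffic.db (see below).

```go
// Minimal sketch of robots.txt enforcement, NOT the actual main.go:
// parse the Disallow rules for "User-agent: *" from a local robots.txt,
// return 403 to crawler-looking User-Agents that request a disallowed
// path, and reverse-proxy everything else to the upstream Forgejo.
// Upstream URL, listen address, and the bot heuristic are assumptions.
package main

import (
	"bufio"
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"os"
	"strings"
)

// loadDisallows extracts the Disallow path prefixes that apply to all agents.
func loadDisallows(path string) ([]string, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var rules []string
	applies := false
	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		lower := strings.ToLower(line)
		switch {
		case strings.HasPrefix(lower, "user-agent:"):
			agent := strings.TrimSpace(line[len("user-agent:"):])
			applies = agent == "*"
		case applies && strings.HasPrefix(lower, "disallow:"):
			prefix := strings.TrimSpace(line[len("disallow:"):])
			if prefix != "" {
				rules = append(rules, prefix)
			}
		}
	}
	return rules, scanner.Err()
}

// looksLikeCrawler is a crude stand-in for whatever detection the real
// service does; it only checks for common bot keywords in the User-Agent.
func looksLikeCrawler(userAgent string) bool {
	ua := strings.ToLower(userAgent)
	for _, kw := range []string{"bot", "crawl", "spider"} {
		if strings.Contains(ua, kw) {
			return true
		}
	}
	return false
}

func main() {
	upstream, err := url.Parse("http://gitea:3000") // assumed upstream address
	if err != nil {
		log.Fatal(err)
	}
	disallows, err := loadDisallows("robots.txt")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(upstream)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if looksLikeCrawler(r.UserAgent()) {
			for _, prefix := range disallows {
				if strings.HasPrefix(r.URL.Path, prefix) {
					http.Error(w, "robots.txt disallows this path", http.StatusForbidden)
					return
				}
			}
		}
		proxy.ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The real blocker presumably does more than a static path check (for example, recording repeat offenders in traffic.db), but reverse proxy plus policy check is the basic shape.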
If anyone needs to clear the stored traffic data to unblock someone (deleting traffic.db resets all recorded traffic and block state), these are the commands to run on paimon:
sudo -i
docker stop gitea_forgejo-crawler-blocker_1
rm /etc/docker-compose/gitea/forgejo-crawler-blocker/traffic.db
docker start gitea_forgejo-crawler-blocker_1
persistent data storage
Persistent data is stored at /forgejo-crawler-blocker/data inside the docker container.
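For orientation, here is a hypothetical docker-compose excerpt showing how that container path could line up with the host path /etc/docker-compose/gitea/forgejo-crawler-blocker/traffic.db used in the unblock commands above; the real service definition in /etc/docker-compose/gitea may differ.

```yaml
# hypothetical excerpt -- service and image names are inferred from the
# container name gitea_forgejo-crawler-blocker_1 used elsewhere on this page
services:
  forgejo-crawler-blocker:
    image: forgejo-crawler-blocker
    volumes:
      # host dir containing traffic.db, mounted at the container's data dir
      - ./forgejo-crawler-blocker:/forgejo-crawler-blocker/data
```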
forest's manual build process
Run on the server (paimon):
cd /home/forest/forgejo-crawler-blocker && git pull sequentialread main &&
cd /etc/docker-compose/gitea &&
docker stop gitea_forgejo-crawler-blocker_1 || true &&
docker rm gitea_forgejo-crawler-blocker_1 || true &&
docker image rm gitea_forgejo-crawler-blocker || true &&
rm -f forgejo-crawler-blocker/traffic.db &&
docker-compose up -d &&
sleep 1 &&
docker logs -n 1000 -f gitea_forgejo-crawler-blocker_1