
ryobi-crawler

Project start

  1. Clone the repository: git clone https://git.techtube.pl/krzysiej/ryobi-crawler.git
  2. Change into the project directory: cd ryobi-crawler
  3. Build and start the Docker container: docker compose up -d
  4. Run docker compose exec php-app php console.php app:migrate to create database.sqlite and its tables.
  5. Run docker compose exec php-app php console.php app:scrape to scrape all products from the Ryobi website.
  6. Open localhost:9001 in a web browser to access the web interface.
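
The setup steps above can be sketched as a single shell session (this assumes Docker and Docker Compose are installed on the host, and uses the defaults from this README):

```shell
# Clone the project and enter its directory
git clone https://git.techtube.pl/krzysiej/ryobi-crawler.git
cd ryobi-crawler

# Build the image and start the container in the background
docker compose up -d

# Create database.sqlite and its tables inside the php-app container
docker compose exec php-app php console.php app:migrate

# Scrape all products from the Ryobi website
docker compose exec php-app php console.php app:scrape

# The web interface is now reachable at localhost:9001
```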

Update project

  1. Change into the project directory.
  2. Run git pull.
  3. Clear the production cache by removing the cache directory: rm -rf var/cache
  4. Rebuild the image and recreate the container in one go: docker compose up -d --build --force-recreate
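
The update steps can likewise be run as one sequence (the project path below is a placeholder; substitute your actual checkout location):

```shell
# Enter the project directory (placeholder path)
cd /var/project/directory/

# Pull the latest changes
git pull

# Clear the production cache
rm -rf var/cache

# Rebuild the image and recreate the container
docker compose up -d --build --force-recreate
```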

Bonus

Install composer package

  1. Run bin/composer require vendor/package-name
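
For example, to add a concrete package (symfony/http-client here is only an illustrative choice, not a dependency of this project):

```shell
# The bin/composer wrapper runs Composer for the project,
# so the package lands in the project's composer.json
bin/composer require symfony/http-client
```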

Running Cron

For now, the only way to run the app:scrape command on a schedule is to use the host's crontab.

  1. Run crontab -e to edit the host's crontab.
  2. Add a new line, e.g.: 0 1 * * * cd /var/project/directory/ && docker compose exec php-app php console.php app:scrape
  3. Save and exit the editor. Cron will now execute app:scrape once per day.
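
The resulting crontab entry might look like this (the project path is a placeholder; the five leading fields are minute, hour, day of month, month, and day of week, so 0 1 * * * fires daily at 01:00):

```shell
# m h dom mon dow  command
# Run app:scrape every day at 01:00
0 1 * * * cd /var/project/directory/ && docker compose exec php-app php console.php app:scrape
```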

Screenshots

Main screen of the web view
