# ryobi-crawler
## Project start
1. Clone repository using `git clone https://git.techtube.pl/krzysiej/ryobi-crawler.git`
2. Cd into project directory `cd ryobi-crawler`
3. Build and start docker container `docker compose up -d --build --force-recreate`
4. Run `docker compose exec php-app php console.php app:migrate` to create `database.sqlite` and its tables.
5. Run `docker compose exec php-app php console.php app:scrape` to scrape all products from the Ryobi website.
6. Open `http://localhost:9001` in a web browser to access the web interface.
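The six steps above can also be replayed as one small shell script. This is only a sketch (it assumes Docker Compose v2 and a Unix-like shell): by default it just prints each command, and you set `RUN=1` to execute them for real.

```shell
#!/bin/sh
# First-time setup sketch for ryobi-crawler.
# Commands are only echoed by default; set RUN=1 to execute them.
set -eu
run() {
  if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi
}

run git clone https://git.techtube.pl/krzysiej/ryobi-crawler.git
run cd ryobi-crawler
run docker compose up -d --build --force-recreate
run docker compose exec php-app php console.php app:migrate
run docker compose exec php-app php console.php app:scrape
echo "Done. Web interface: http://localhost:9001"
```

Review the echoed commands first, then rerun with `RUN=1 sh setup.sh`.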
## Update project
1. Change into the project directory: `cd ryobi-crawler`
2. Run `git pull`
3. On production, refresh the cache by removing the cache directory: `rm -rf var/cache`
4. Build and start the container in one go: `docker compose up -d --build --force-recreate`
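The update steps can also be kept in a small helper script so a deployment is a single command. A sketch, assuming a hypothetical `update.sh` and a checkout at `~/ryobi-crawler` (both names are examples, adjust to your setup):

```shell
# Write a hypothetical update.sh helper; the default checkout path is an assumption.
cat > update.sh <<'EOF'
#!/bin/sh
set -eu
cd "${PROJECT_DIR:-$HOME/ryobi-crawler}"
git pull
rm -rf var/cache
docker compose up -d --build --force-recreate
EOF
chmod +x update.sh
```

Run it as `./update.sh`, or `PROJECT_DIR=/path/to/ryobi-crawler ./update.sh` when the checkout lives elsewhere.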
## Bonus
### Install composer package
1. Run `bin/composer require vendor/package-name`
## Running Cron
For now, the only way to run the `app:scrape` command on a schedule is the host crontab.
1. Run `crontab -e` to edit the host crontab.
2. Add a line such as: `0 1 * * * cd /var/project/directory/ && docker compose exec php-app php console.php app:scrape`
3. Save and exit the editor. Cron will now run `app:scrape` once per day (at 01:00 with the schedule above).
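Two optional tweaks to the crontab entry above are worth considering: cron jobs have no terminal, so `docker compose exec -T` disables pseudo-TTY allocation, and redirecting output to a log file makes failed runs easier to diagnose. A variant with both applied (the log path is just an example):

```
0 1 * * * cd /var/project/directory/ && docker compose exec -T php-app php console.php app:scrape >> /var/log/ryobi-scrape.log 2>&1
```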
## Screenshots
### Main screen of the web view