Compare commits: 0.39.5 ... image-bina (95 commits)
Commits (SHA1):
9b036d7b19, 0761984bcd, e73721a3f0, 86fc9d669f, 7a66b69158, ddd7b2772d, 305060f79c, cfcf59d009,
af25b824a0, a29085fa18, d7832d735d, 7d1c4d7673, 6e00f0e025, 4f536bb559, 38d8aa8d28, dec47d5c43,
cec24fe2c1, f4bc0aa2ba, 499c4797da, 9bc71d187e, 536948c8c6, d4f4ab306a, 8d2e240a2a, d7ed479ca2,
f25cdf0a67, 5214a7e0f3, eb3dca3805, a580c238b6, 7ca89f5ec3, 8ab8aaa6ae, 22ef9afb93, abaec224f6,
5a645fb74d, 14db60e518, e250c552d0, 8e54a17e14, 8607eccaad, 17511d0d7d, 41b806228c, 453cf81e1d,
0095b28ea3, 73101a47e7, 03f776ca45, 39b7be9e7a, 6611823962, c1c453e4fe, 4887180671, ac7378b7fb,
eeba8c864d, abe88192f4, af8efbb6d2, bbc2875ef3, b7ca10ebac, a896493797, e5fe095f16, 271181968f,
8206383ee5, ecfc02ba23, 3331ccd061, bd8f389a65, bc74227635, 07c60a6acc, 7916faf58b, febb2bbf0d,
59d31bf76f, f87f7077a6, f166ab1e30, 55e679e973, e211ba806f, b33105d576, b73f5a5c88, 023951a10e,
fbd9ecab62, b5c1fce136, 489671dcca, d4dc3466dc, 0439acacbe, 735fc2ac8e, 8a825f0055, d0ae8b7923,
a504773941, feb8e6c76c, a37a5038d8, f1933b786c, d6a6ef2c1d, cf9554b169, d602cf4646, dfcae4ee64,
e3bcd8c9bf, c4990fa3f9, 98461d813e, 8ec17a4c83, ee708cc395, 8a670c029a, 9fa5aec01e
CONTRIBUTING.md (new file, 15 lines)

````
@@ -0,0 +1,15 @@
Contributing is always welcome!

I am no professional flask developer, if you know a better way that something can be done, please let me know!

Otherwise, it's always best to PR into the `dev` branch.

Please be sure that all new functionality has a matching test!

Use `pytest` to validate/test, you can run the existing tests as `pytest tests/test_notifications.py` for example

```
pip3 install -r requirements-dev
```

this is from https://github.com/dgtlmoon/changedetection.io/blob/master/requirements-dev.txt
````
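Since the notes above ask for a matching test with every new piece of functionality, a minimal sketch of such a test is shown below. The `client` and `live_server` fixture names and the `settings_page` endpoint are assumptions about the existing test harness, not something introduced by this comparison:

```python
# tests/test_example_feature.py - hypothetical example only
from flask import url_for


def test_settings_page_loads(client, live_server):
    # Assumes the shared pytest fixtures already used by the tests in tests/
    res = client.get(url_for("settings_page"))
    assert res.status_code == 200
```

It would be run the same way as the existing tests, for example `pytest tests/test_example_feature.py`.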
README.md (72 changes)

````
@@ -7,36 +7,41 @@
_Know when web pages change! Stay ontop of new information!_

Live your data-life *pro-actively* instead of *re-actively*, do not rely on manipulative social media for consuming important information.
Live your data-life *pro-actively* instead of *re-actively*.

Open source web page monitoring, notification and change detection.

<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/screenshot.png" style="max-width:100%;" alt="Self-hosted web page change monitoring" title="Self-hosted web page change monitoring" />

[](https://dashboard.heroku.com/new?template=https%3A%2F%2Fgithub.com%2Fdgtlmoon%2Fchangedetection.io%2Ftree%2Fmaster)

**Get your own instance now on Lemonade!**

[](https://lemonade.changedetection.io/start)

- Automatic Updates, Automatic Backups, No Heroku "paused application", don't miss a change!
- Javascript browser included
- Pay with Bitcoin

#### Example use cases

Know when ...

- Government department updates (changes are often only on their websites)
- Local government news (changes are often only on their websites)
- Products and services have a change in pricing
- Governmental department updates (changes are often only on their websites)
- New software releases, security advisories when you're not on their mailing list.
- Festivals with changes
- Realestate listing changes
- COVID related news from government websites
- University/organisation news from their website
- Detect and monitor changes in JSON API responses
- API monitoring and alerting
- Changes in legal and other documents
- Trigger API calls via notifications when text appears on a website
- Glue together API's using the JSON filter and JSON notifications
- Glue together APIs using the JSON filter and JSON notifications
- Create RSS feeds based on changes in web content
- You have a very sensitive list of URLs to watch and you do _not_ want to use the paid alternatives. (Remember, _you_ are the product)

_Need an actual Chrome runner with Javascript support? We support fetching via WebDriver!</a>_

**Get monitoring now! super simple.**

<a href="https://dashboard.heroku.com/new?template=https%3A%2F%2Fgithub.com%2Fdgtlmoon%2Fchangedetection.io%2Ftree%2Fmaster">Deploy to Heroku for free</a>, Run this python directly, or with <a href="https://docs.docker.com/get-docker/">docker</a> and/or <a href="https://www.digitalocean.com/community/tutorial_collections/how-to-install-docker-compose">docker-compose</a>

## Screenshots

Examining differences in content.
````
````
@@ -88,7 +93,13 @@ docker run -d --restart always -p "127.0.0.1:5000:5000" -v datastore-volume:/dat
docker-compose pull && docker-compose up -d
```

### Notifications
See the wiki for more information https://github.com/dgtlmoon/changedetection.io/wiki

## Filters
XPath, JSONPath and CSS support comes baked in! You can be as specific as you need, use XPath exported from various XPath element query creation tools.

## Notifications

ChangeDetection.io supports a massive amount of notifications (including email, office365, custom APIs, etc) when a web-page has a change detected thanks to the <a href="https://github.com/caronc/apprise">apprise</a> library.
Simply set one or more notification URL's in the _[edit]_ tab of that watch.
````
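As an illustrative aside, the apprise library mentioned above can be driven directly from Python in a few lines; the Discord webhook URL below is a placeholder, and any apprise-compatible service URL works the same way:

```python
import apprise

a = apprise.Apprise()
# Placeholder service URL; apprise supports many schemes (mailto://, gotify://, slack://, ...)
a.add("discord://webhook_id/webhook_token")
a.notify(title="changedetection.io", body="Change detected at https://example.com")
```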
````
@@ -112,7 +123,7 @@ Just some examples

Now you can also customise your notification content!

### JSON API Monitoring
## JSON API Monitoring

Detect changes and monitor data in JSON API's by using the built-in JSONPath selectors as a filter / selector.

@@ -122,7 +133,7 @@ This will re-parse the JSON and apply formatting to the text, making it super ea

#### Parse JSON embedded in HTML!
### Parse JSON embedded in HTML!

When you enable a `json:` filter, you can even automatically extract and parse embedded JSON inside a HTML page! Amazingly handy for sites that build content based on JSON, such as many e-commerce websites.

@@ -136,34 +147,19 @@ When you enable a `json:` filter, you can even automatically extract and parse e

`json:$.price` would give `23.50`, or you can extract the whole structure
````
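For illustration, the `json:` filter described above boils down to a JSONPath lookup. A standalone sketch using jsonpath_ng (the same library the forms validation further down imports), with a made-up document:

```python
import json

from jsonpath_ng.ext import parse

doc = json.loads('{"name": "Widget", "price": 23.50}')

# A watch configured with the filter "json:$.price" evaluates the part after "json:"
matches = parse("$.price").find(doc)
print(matches[0].value)  # 23.5
```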
````
### Proxy
## Proxy configuration

A proxy for ChangeDetection.io can be configured by setting environment the
`HTTP_PROXY`, `HTTPS_PROXY` variables, examples are also in the `docker-compose.yml`
See the wiki https://github.com/dgtlmoon/changedetection.io/wiki/Proxy-configuration

`NO_PROXY` exclude list can be specified by following `"localhost,192.168.0.0/24"`
## Raspberry Pi support?

as `docker run` with `-e`
Raspberry Pi and linux/arm/v6 linux/arm/v7 arm64 devices are supported! See the wiki for [details](https://github.com/dgtlmoon/changedetection.io/wiki/Fetching-pages-with-WebDriver)

```
docker run -d --restart always -e HTTPS_PROXY="socks5h://10.10.1.10:1080" -p "127.0.0.1:5000:5000" -v datastore-volume:/datastore --name changedetection.io dgtlmoon/changedetection.io
```

With `docker-compose`, see the `Proxy support example` in <a href="https://github.com/dgtlmoon/changedetection.io/blob/master/docker-compose.yml">docker-compose.yml</a>.

For more information see https://docs.python-requests.org/en/master/user/advanced/#proxies

This proxy support also extends to the notifications https://github.com/caronc/apprise/issues/387#issuecomment-841718867
````
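For reference, the proxy variables above end up as ordinary requests proxy settings; a rough, illustrative equivalent (the address is a placeholder):

```python
import os

import requests

# HTTP_PROXY / HTTPS_PROXY from the environment are honoured automatically by requests,
# or they can be passed explicitly per request. socks5h:// URLs need the requests[socks] extra.
proxies = {
    "http": os.getenv("HTTP_PROXY", "socks5h://10.10.1.10:1080"),
    "https": os.getenv("HTTPS_PROXY", "socks5h://10.10.1.10:1080"),
}
r = requests.get("https://example.com", proxies=proxies, timeout=10)
print(r.status_code)
```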
````
### RaspberriPi support?

RaspberriPi and linux/arm/v6 linux/arm/v7 arm64 devices are supported!

### Windows native support?
## Windows native support?

Sorry not yet :( https://github.com/dgtlmoon/changedetection.io/labels/windows

### Support us
## Support us

Do you use changedetection.io to make money? does it save you time or money? Does it make your life easier? less stressful? Remember, we write this software when we should be doing actual paid work, we have to buy food and pay rent just like you.

@@ -173,8 +169,12 @@ BTC `1PLFN327GyUarpJd7nVe7Reqg9qHx5frNn`

<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/btc-support.png" style="max-width:50%;" alt="Support us!" />

## Commercial Support

[release-shield]: https://img.shields.io/github/v/release/dgtlmoon/changedetection.io?style=for-the-badge
I offer commercial support, this software is depended on by network security, aerospace , data-science and data-journalist professionals just to name a few, please reach out at dgtlmoon@gmail.com for any enquiries, I am more than glad to work with your organisation to further the possibilities of what can be done with changedetection.io

[release-shield]: https://img.shields.io:/github/v/release/dgtlmoon/changedetection.io?style=for-the-badge
[docker-pulls]: https://img.shields.io/docker/pulls/dgtlmoon/changedetection.io?style=for-the-badge
[test-shield]: https://github.com/dgtlmoon/changedetection.io/actions/workflows/test-only.yml/badge.svg?branch=master
````
```
@@ -14,6 +14,7 @@ from changedetectionio import store

def main():
ssl_mode = False
host = ''
port = os.environ.get('PORT') or 5000
do_cleanup = False

@@ -21,9 +22,9 @@ def main():
datastore_path = os.path.join(os.getcwd(), "datastore")

try:
opts, args = getopt.getopt(sys.argv[1:], "Ccsd:p:", "port")
opts, args = getopt.getopt(sys.argv[1:], "Ccsd:h:p:", "port")
except getopt.GetoptError:
print('backend.py -s SSL enable -p [port] -d [datastore path]')
print('backend.py -s SSL enable -h [host] -p [port] -d [datastore path]')
sys.exit(2)

create_datastore_dir = False

@@ -37,6 +38,9 @@ def main():
if opt == '-s':
ssl_mode = True

if opt == '-h':
host = arg

if opt == '-p':
port = int(arg)

@@ -59,7 +63,7 @@ def main():
os.mkdir(app_config['datastore_path'])
else:
print ("ERROR: Directory path for the datastore '{}' does not exist, cannot start, please make sure the directory exists.\n"
"Alternatively, use the -d parameter.".format(app_config['datastore_path']),file=sys.stderr)
"Alternatively, use the -C parameter.".format(app_config['datastore_path']),file=sys.stderr)
sys.exit(2)

datastore = store.ChangeDetectionStore(datastore_path=app_config['datastore_path'], version_tag=changedetectionio.__version__)

@@ -93,13 +97,13 @@ def main():

if ssl_mode:
# @todo finalise SSL config, but this should get you in the right direction if you need it.
eventlet.wsgi.server(eventlet.wrap_ssl(eventlet.listen(('', port)),
eventlet.wsgi.server(eventlet.wrap_ssl(eventlet.listen((host, port)),
certfile='cert.pem',
keyfile='privkey.pem',
server_side=True), app)

else:
eventlet.wsgi.server(eventlet.listen(('', int(port))), app)
eventlet.wsgi.server(eventlet.listen((host, int(port))), app)


if __name__ == '__main__':
```
```
@@ -11,26 +11,32 @@
# proxy per check
# - flask_cors, itsdangerous,MarkupSafe

import time
import datetime
import os
import timeago
import flask_login
from flask_login import login_required

import queue
import threading
import time
from copy import deepcopy
from threading import Event

import queue

from flask import Flask, render_template, request, send_from_directory, abort, redirect, url_for, flash

from feedgen.feed import FeedGenerator
from flask import make_response
import datetime
import flask_login
import pytz
from copy import deepcopy
import timeago
from feedgen.feed import FeedGenerator
from flask import (
Flask,
abort,
flash,
make_response,
redirect,
render_template,
request,
send_from_directory,
url_for,
)
from flask_login import login_required

__version__ = '0.39.5'
__version__ = '0.39.8'

datastore = None

@@ -64,6 +70,7 @@ app.config['LOGIN_DISABLED'] = False
# Disables caching of the templates
app.config['TEMPLATES_AUTO_RELOAD'] = True

notification_debug_log=[]

def init_app_secret(datastore_path):
secret = ""

@@ -137,13 +144,21 @@ class User(flask_login.UserMixin):
def get_id(self):
return str(self.id)

# Compare given password against JSON store or Env var
def check_password(self, password):

import hashlib
import base64
import hashlib

# Can be stored in env (for deployments) or in the general configs
raw_salt_pass = os.getenv("SALTED_PASS", False)

if not raw_salt_pass:
raw_salt_pass = datastore.data['settings']['application']['password']

raw_salt_pass = base64.b64decode(raw_salt_pass)

# Getting the values back out
raw_salt_pass = base64.b64decode(datastore.data['settings']['application']['password'])
salt_from_storage = raw_salt_pass[:32] # 32 is the length of the salt

# Use the exact same setup you used to generate the key, but this time put in the password to check

@@ -194,7 +209,7 @@ def changedetection_app(config=None, datastore_o=None):
@app.route('/login', methods=['GET', 'POST'])
def login():

if not datastore.data['settings']['application']['password']:
if not datastore.data['settings']['application']['password'] and not os.getenv("SALTED_PASS", False):
flash("Login not required, no password enabled.", "notice")
return redirect(url_for('index'))

@@ -209,10 +224,18 @@ def changedetection_app(config=None, datastore_o=None):

if (user.check_password(password)):
flask_login.login_user(user, remember=True)
next = request.args.get('next')

# For now there's nothing else interesting here other than the index/list page
# It's more reliable and safe to ignore the 'next' redirect
# When we used...
# next = request.args.get('next')
# return redirect(next or url_for('index'))
# We would sometimes get login loop errors on sites hosted in sub-paths

# note for the future:
# if not is_safe_url(next):
# return flask.abort(400)
return redirect(next or url_for('index'))
return redirect(url_for('index'))

else:
flash('Incorrect password', 'error')

@@ -221,8 +244,10 @@ def changedetection_app(config=None, datastore_o=None):

@app.before_request
def do_something_whenever_a_request_comes_in():
# Disable password loginif there is not one set
app.config['LOGIN_DISABLED'] = datastore.data['settings']['application']['password'] == False

# Disable password login if there is not one set
# (No password in settings or env var)
app.config['LOGIN_DISABLED'] = datastore.data['settings']['application']['password'] == False and os.getenv("SALTED_PASS", False) == False

# For the RSS path, allow access via a token
if request.path == '/rss' and request.args.get('token'):

@@ -268,9 +293,23 @@ def changedetection_app(config=None, datastore_o=None):
# @todo In the future make this a configurable link back (see work on BASE_URL https://github.com/dgtlmoon/changedetection.io/pull/228)
guid = "{}/{}".format(watch['uuid'], watch['last_changed'])
fe = fg.add_entry()
fe.title(watch['url'])
fe.link(href=watch['url'])
fe.description(watch['url'])

# Include a link to the diff page, they will have to login here to see if password protection is enabled.
# Description is the page you watch, link takes you to the diff JS UI page
base_url = datastore.data['settings']['application']['base_url']
if base_url == '':
base_url = "<base-url-env-var-not-set>"

diff_link = {'href': "{}{}".format(base_url, url_for('diff_history_page', uuid=watch['uuid']))}

# @todo use title if it exists
fe.link(link=diff_link)
fe.title(title=watch['url'])

# @todo in the future <description><![CDATA[<html><body>Any code html is valid.</body></html>]]></description>
fe.description(description=watch['url'])

fe.guid(guid, permalink=False)
dt = datetime.datetime.fromtimestamp(int(watch['newest_history_key']))
dt = dt.replace(tzinfo=pytz.UTC)

@@ -386,12 +425,13 @@ def changedetection_app(config=None, datastore_o=None):
def get_current_checksum_include_ignore_text(uuid):

import hashlib

from changedetectionio import fetch_site_status

# Get the most recent one
newest_history_key = datastore.get_val(uuid, 'newest_history_key')

# 0 means that theres only one, so that there should be no 'unviewed' history availabe
# 0 means that theres only one, so that there should be no 'unviewed' history available
if newest_history_key == 0:
newest_history_key = list(datastore.data['watching'][uuid]['history'].keys())[0]

@@ -404,7 +444,11 @@ def changedetection_app(config=None, datastore_o=None):
stripped_content = handler.strip_ignore_text(raw_content,
datastore.data['watching'][uuid]['ignore_text'])

checksum = hashlib.md5(stripped_content).hexdigest()
if datastore.data['settings']['application'].get('ignore_whitespace', False):
checksum = hashlib.md5(stripped_content.translate(None, b'\r\n\t ')).hexdigest()
else:
checksum = hashlib.md5(stripped_content).hexdigest()

return checksum

return datastore.data['watching'][uuid]['previous_md5']

@@ -445,6 +489,8 @@ def changedetection_app(config=None, datastore_o=None):
'tag': form.tag.data.strip(),
'title': form.title.data.strip(),
'headers': form.headers.data,
'body': form.body.data,
'method': form.method.data,
'fetch_backend': form.fetch_backend.data,
'trigger_text': form.trigger_text.data,
'notification_title': form.notification_title.data,

@@ -492,6 +538,7 @@ def changedetection_app(config=None, datastore_o=None):
'notification_title': form.notification_title.data,
'notification_body': form.notification_body.data,
'notification_format': form.notification_format.data,
'uuid': uuid
}
notification_q.put(n_object)
flash('Test notification queued.')

@@ -528,14 +575,15 @@ def changedetection_app(config=None, datastore_o=None):
@login_required
def settings_page():

from changedetectionio import forms
from changedetectionio import content_fetcher
from changedetectionio import content_fetcher, forms

form = forms.globalSettingsForm(request.form)

if request.method == 'GET':
form.minutes_between_check.data = int(datastore.data['settings']['requests']['minutes_between_check'])
form.notification_urls.data = datastore.data['settings']['application']['notification_urls']
form.global_ignore_text.data = datastore.data['settings']['application']['global_ignore_text']
form.ignore_whitespace.data = datastore.data['settings']['application']['ignore_whitespace']
form.extract_title_as_title.data = datastore.data['settings']['application']['extract_title_as_title']
form.fetch_backend.data = datastore.data['settings']['application']['fetch_backend']
form.notification_title.data = datastore.data['settings']['application']['notification_title']

@@ -543,8 +591,8 @@ def changedetection_app(config=None, datastore_o=None):
form.notification_format.data = datastore.data['settings']['application']['notification_format']
form.base_url.data = datastore.data['settings']['application']['base_url']

# Password unset is a GET
if request.values.get('removepassword') == 'yes':
# Password unset is a GET, but we can lock the session to always need the password
if not os.getenv("SALTED_PASS", False) and request.values.get('removepassword') == 'yes':
from pathlib import Path
datastore.data['settings']['application']['password'] = False
flash("Password protection removed.", 'notice')

@@ -562,6 +610,8 @@ def changedetection_app(config=None, datastore_o=None):
datastore.data['settings']['application']['notification_format'] = form.notification_format.data
datastore.data['settings']['application']['notification_urls'] = form.notification_urls.data
datastore.data['settings']['application']['base_url'] = form.base_url.data
datastore.data['settings']['application']['global_ignore_text'] = form.global_ignore_text.data
datastore.data['settings']['application']['ignore_whitespace'] = form.ignore_whitespace.data

if form.trigger_check.data:
if len(form.notification_urls.data):

@@ -576,7 +626,7 @@ def changedetection_app(config=None, datastore_o=None):
else:
flash('No notification URLs set, cannot send test.', 'error')

if form.password.encrypted_password:
if not os.getenv("SALTED_PASS", False) and form.password.encrypted_password:
datastore.data['settings']['application']['password'] = form.password.encrypted_password
flash("Password protection enabled.", 'notice')
flask_login.logout_user()

@@ -588,7 +638,10 @@ def changedetection_app(config=None, datastore_o=None):
if request.method == 'POST' and not form.validate():
flash("An error occurred, please see below.", "error")

output = render_template("settings.html", form=form, current_base_url = datastore.data['settings']['application']['base_url'])
output = render_template("settings.html",
form=form,
current_base_url = datastore.data['settings']['application']['base_url'],
hide_remove_pass=os.getenv("SALTED_PASS", False))

return output

@@ -604,8 +657,10 @@ def changedetection_app(config=None, datastore_o=None):
urls = request.values.get('urls').split("\n")
for url in urls:
url = url.strip()
url, *tags = url.split(" ")
# Flask wtform validators wont work with basic auth, use validators package
if len(url) and validators.url(url):
new_uuid = datastore.add_watch(url=url.strip(), tag="")
new_uuid = datastore.add_watch(url=url.strip(), tag=" ".join(tags))
# Straight into the queue.
update_q.put(new_uuid)
good += 1

@@ -640,6 +695,10 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/diff/<string:uuid>", methods=['GET'])
@login_required
def diff_history_page(uuid):
from changedetectionio import content_fetcher

newest_version_file_contents = ""
previous_version_file_contents = ""

# More for testing, possible to return the first/only
if uuid == 'first':

@@ -665,21 +724,28 @@ def changedetection_app(config=None, datastore_o=None):

# Save the current newest history as the most recently viewed
datastore.set_last_viewed(uuid, dates[0])
newest_file = watch['history'][dates[0]]
with open(newest_file, 'r') as f:
newest_version_file_contents = f.read()

previous_version = request.args.get('previous_version')
try:
previous_file = watch['history'][previous_version]
except KeyError:
# Not present, use a default value, the second one in the sorted list.
previous_file = watch['history'][dates[1]]
if ('content-type' in watch and content_fetcher.supported_binary_type(watch['content-type'])):
template = "diff-image.html"
else:
newest_file = watch['history'][dates[0]]
with open(newest_file, 'r') as f:
newest_version_file_contents = f.read()

with open(previous_file, 'r') as f:
previous_version_file_contents = f.read()
try:
previous_file = watch['history'][previous_version]
except KeyError:
# Not present, use a default value, the second one in the sorted list.
previous_file = watch['history'][dates[1]]

output = render_template("diff.html", watch_a=watch,
with open(previous_file, 'r') as f:
previous_version_file_contents = f.read()

template = "diff.html"

output = render_template(template,
watch_a=watch,
newest=newest_version_file_contents,
previous=previous_version_file_contents,
extra_stylesheets=extra_stylesheets,

@@ -696,6 +762,7 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/preview/<string:uuid>", methods=['GET'])
@login_required
def preview_page(uuid):
from changedetectionio import content_fetcher

# More for testing, possible to return the first/only
if uuid == 'first':

@@ -710,16 +777,99 @@ def changedetection_app(config=None, datastore_o=None):
return redirect(url_for('index'))

newest = list(watch['history'].keys())[-1]
with open(watch['history'][newest], 'r') as f:
content = f.readlines()
fname = watch['history'][newest]

if ('content-type' in watch and content_fetcher.supported_binary_type(watch['content-type'])):
template = "preview-image.html"
content = fname
else:
template = "preview.html"
try:
with open(fname, 'r') as f:
content = f.read()
except:
content = "Cant read {}".format(fname)

output = render_template("preview.html",
content=content,
extra_stylesheets=extra_stylesheets,
current_diff_url=watch['url'],
uuid=uuid)
uuid=uuid,
watch=watch)
return output

@app.route("/settings/notification-logs", methods=['GET'])
@login_required
def notification_logs():
global notification_debug_log
output = render_template("notification-log.html",
logs=notification_debug_log if len(notification_debug_log) else ["No errors or warnings detected"])

return output


# render an image which contains the diff of two images
# We always compare the newest against whatever compare_date we are given
@app.route("/diff/show-image/<string:uuid>/<string:datestr>")
def show_single_image(uuid, datestr):

from flask import make_response
watch = datastore.data['watching'][uuid]

if datestr == 'None' or datestr is None:
datestr = list(watch['history'].keys())[0]

fname = watch['history'][datestr]
with open(fname, 'rb') as f:
resp = make_response(f.read())

# @todo assumption here about the type, re-encode? detect?
resp.headers['Content-Type'] = 'image/jpeg'
return resp

# render an image which contains the diff of two images
# We always compare the newest against whatever compare_date we are given
@app.route("/diff/image/<string:uuid>/<string:compare_date>")
def render_diff_image(uuid, compare_date):
from changedetectionio import image_diff

from flask import make_response
watch = datastore.data['watching'][uuid]
newest = list(watch['history'].keys())[-1]

# @todo this is weird
if compare_date == 'None' or compare_date is None:
compare_date = list(watch['history'].keys())[0]

new_img = watch['history'][newest]
prev_img = watch['history'][compare_date]
img = image_diff.render_diff(new_img, prev_img)

resp = make_response(img)
resp.headers['Content-Type'] = 'image/jpeg'
return resp


@app.route("/api/<string:uuid>/snapshot/current", methods=['GET'])
@login_required
def api_snapshot(uuid):

# More for testing, possible to return the first/only
if uuid == 'first':
uuid = list(datastore.data['watching'].keys()).pop()

try:
watch = datastore.data['watching'][uuid]
except KeyError:
return abort(400, "No history found for the specified link, bad link?")

newest = list(watch['history'].keys())[-1]
with open(watch['history'][newest], 'r') as f:
content = f.read()

resp = make_response(content)
resp.headers['Content-Type'] = 'text/plain'
return resp

@app.route("/favicon.ico", methods=['GET'])
def favicon():

@@ -734,7 +884,8 @@ def changedetection_app(config=None, datastore_o=None):
from pathlib import Path

# Remove any existing backup file, for now we just keep one file
for previous_backup_filename in Path(app.config['datastore_path']).rglob('changedetection-backup-*.zip'):

for previous_backup_filename in Path(datastore_o.datastore_path).rglob('changedetection-backup-*.zip'):
os.unlink(previous_backup_filename)

# create a ZipFile object

@@ -742,7 +893,7 @@ def changedetection_app(config=None, datastore_o=None):

# We only care about UUIDS from the current index file
uuids = list(datastore.data['watching'].keys())
backup_filepath = os.path.join(app.config['datastore_path'], backupname)
backup_filepath = os.path.join(datastore_o.datastore_path, backupname)

with zipfile.ZipFile(backup_filepath, "w",
compression=zipfile.ZIP_DEFLATED,

@@ -752,34 +903,51 @@ def changedetection_app(config=None, datastore_o=None):
datastore.sync_to_json()

# Add the index
zipObj.write(os.path.join(app.config['datastore_path'], "url-watches.json"), arcname="url-watches.json")
zipObj.write(os.path.join(datastore_o.datastore_path, "url-watches.json"), arcname="url-watches.json")

# Add the flask app secret
zipObj.write(os.path.join(app.config['datastore_path'], "secret.txt"), arcname="secret.txt")
zipObj.write(os.path.join(datastore_o.datastore_path, "secret.txt"), arcname="secret.txt")

# Add any snapshot data we find, use the full path to access the file, but make the file 'relative' in the Zip.
for txt_file_path in Path(app.config['datastore_path']).rglob('*.txt'):
for txt_file_path in Path(datastore_o.datastore_path).rglob('*.txt'):
parent_p = txt_file_path.parent
if parent_p.name in uuids:
zipObj.write(txt_file_path,
arcname=str(txt_file_path).replace(app.config['datastore_path'], ''),
arcname=str(txt_file_path).replace(datastore_o.datastore_path, ''),
compress_type=zipfile.ZIP_DEFLATED,
compresslevel=8)

# Create a list file with just the URLs, so it's easier to port somewhere else in the future
list_file = os.path.join(app.config['datastore_path'], "url-list.txt")
with open(list_file, "w") as f:
for uuid in datastore.data['watching']:
url = datastore.data['watching'][uuid]['url']
list_file = "url-list.txt"
with open(os.path.join(datastore_o.datastore_path, list_file), "w") as f:
for uuid in datastore.data["watching"]:
url = datastore.data["watching"][uuid]["url"]
f.write("{}\r\n".format(url))
list_with_tags_file = "url-list-with-tags.txt"
with open(
os.path.join(datastore_o.datastore_path, list_with_tags_file), "w"
) as f:
for uuid in datastore.data["watching"]:
url = datastore.data["watching"][uuid]["url"]
tag = datastore.data["watching"][uuid]["tag"]
f.write("{} {}\r\n".format(url, tag))

# Add it to the Zip
zipObj.write(list_file,
arcname="url-list.txt",
compress_type=zipfile.ZIP_DEFLATED,
compresslevel=8)
zipObj.write(
os.path.join(datastore_o.datastore_path, list_file),
arcname=list_file,
compress_type=zipfile.ZIP_DEFLATED,
compresslevel=8,
)
zipObj.write(
os.path.join(datastore_o.datastore_path, list_with_tags_file),
arcname=list_with_tags_file,
compress_type=zipfile.ZIP_DEFLATED,
compresslevel=8,
)

return send_from_directory(app.config['datastore_path'], backupname, as_attachment=True)
# Send_from_directory needs to be the full absolute path
return send_from_directory(os.path.abspath(datastore_o.datastore_path), backupname, as_attachment=True)

@app.route("/static/<string:group>/<string:filename>", methods=['GET'])
def static_content(group, filename):

@@ -816,7 +984,6 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/api/delete", methods=['GET'])
@login_required
def api_delete():

uuid = request.args.get('uuid')
datastore.delete(uuid)
flash('Deleted.')

@@ -871,7 +1038,7 @@ def changedetection_app(config=None, datastore_o=None):
if watch_uuid not in running_uuids and not datastore.data['watching'][watch_uuid]['paused']:
update_q.put(watch_uuid)
i += 1
flash("{} watches are rechecking.".format(i))
flash("{} watches are queued for rechecking.".format(i))
return redirect(url_for('index', tag=tag))

# @todo handle ctrl break

@@ -889,7 +1056,6 @@ def changedetection_app(config=None, datastore_o=None):
# Check for new version and anonymous stats
def check_for_new_version():
import requests

import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

@@ -915,6 +1081,7 @@ def check_for_new_version():
app.config.exit.wait(86400)

def notification_runner():
global notification_debug_log
while not app.config.exit.is_set():
try:
# At the moment only one thread runs (single runner)

@@ -929,14 +1096,30 @@ def notification_runner():
notification.process_notification(n_object, datastore)

except Exception as e:
print("Watch URL: {} Error {}".format(n_object['watch_url'], e))
print("Watch URL: {} Error {}".format(n_object['watch_url'], str(e)))

# UUID wont be present when we submit a 'test' from the global settings
if 'uuid' in n_object:
datastore.update_watch(uuid=n_object['uuid'],
update_obj={'last_notification_error': "Notification error detected, please see logs."})

log_lines = str(e).splitlines()
notification_debug_log += log_lines

# Trim the log length
notification_debug_log = notification_debug_log[-100:]


# Thread runner to check every minute, look for new watches to feed into the Queue.
def ticker_thread_check_time_launch_checks():
from changedetectionio import update_worker

# Spin up Workers.
for _ in range(datastore.data['settings']['requests']['workers']):
# Spin up Workers that do the fetching
# Can be overriden by ENV or use the default settings
n_workers = int(os.getenv("FETCH_WORKERS", datastore.data['settings']['requests']['workers']))
for _ in range(n_workers):
new_worker = update_worker.update_worker(update_q, notification_q, app, datastore)
running_update_threads.append(new_worker)
new_worker.start()
```
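One detail worth spelling out from the `check_password()` hunk above: the stored value decodes to a salt followed by a derived key, with the first 32 bytes being the salt. A minimal sketch of the comparison this implies, assuming PBKDF2-HMAC-SHA256 and an illustrative iteration count (the helper below is not part of the codebase):

```python
import base64
import hashlib
import hmac


def verify_password(candidate: str, stored_b64: str) -> bool:
    raw = base64.b64decode(stored_b64)
    salt, expected_key = raw[:32], raw[32:]
    # Re-derive the key with the same salt and parameters used when the password was set
    new_key = hashlib.pbkdf2_hmac("sha256", candidate.encode("utf-8"), salt, 100000)
    return hmac.compare_digest(new_key, expected_key)
```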
```
@@ -3,17 +3,26 @@ import time
from abc import ABC, abstractmethod
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.webdriver.common.proxy import Proxy as SeleniumProxy
from selenium.common.exceptions import WebDriverException
import urllib3.exceptions

# image/jpeg etc
supported_binary_types = ['image']

class EmptyReply(Exception):
def __init__(self, status_code, url):
# Set this so we can use it in other parts of the app
self.status_code = status_code
self.url = url
return

pass

class Fetcher():
error = None
status_code = None
content = None # Should be bytes?
content = None # Should always be bytes.
headers = None

fetcher_description ="No description"

@@ -22,7 +31,7 @@ class Fetcher():
return self.error

@abstractmethod
def run(self, url, timeout, request_headers):
def run(self, url, timeout, request_headers, request_body, request_method):
# Should set self.error, self.status_code and self.content
pass

@@ -43,6 +52,15 @@ class Fetcher():
# def return_diff(self, stream_a, stream_b):
# return

# Assume we dont support it as binary if its not in our list
def supported_binary_type(content_type):
# Not a binary thing we support? then use text (also used for JSON/XML etc)
# @todo - future - use regex for matching
if content_type and content_type.lower().strip().split('/')[0] not in (string.lower() for string in supported_binary_types):
return False

return True

def available_fetchers():
import inspect
from changedetectionio import content_fetcher

@@ -65,15 +83,39 @@ class html_webdriver(Fetcher):

command_executor = ''

def __init__(self):
self.command_executor = os.getenv("WEBDRIVER_URL", 'http://browser-chrome:4444/wd/hub')
# Configs for Proxy setup
# In the ENV vars, is prefixed with "webdriver_", so it is for example "webdriver_sslProxy"
selenium_proxy_settings_mappings = ['proxyType', 'ftpProxy', 'httpProxy', 'noProxy',
'proxyAutoconfigUrl', 'sslProxy', 'autodetect',
'socksProxy', 'socksVersion', 'socksUsername', 'socksPassword']

def run(self, url, timeout, request_headers):

proxy=None

def __init__(self):
# .strip('"') is going to save someone a lot of time when they accidently wrap the env value
self.command_executor = os.getenv("WEBDRIVER_URL", 'http://browser-chrome:4444/wd/hub').strip('"')

# If any proxy settings are enabled, then we should setup the proxy object
proxy_args = {}
for k in self.selenium_proxy_settings_mappings:
v = os.getenv('webdriver_' + k, False)
if v:
proxy_args[k] = v.strip('"')

if proxy_args:
self.proxy = SeleniumProxy(raw=proxy_args)

def run(self, url, timeout, request_headers, request_body, request_method):

# request_body, request_method unused for now, until some magic in the future happens.

# check env for WEBDRIVER_URL
driver = webdriver.Remote(
command_executor=self.command_executor,
desired_capabilities=DesiredCapabilities.CHROME)
desired_capabilities=DesiredCapabilities.CHROME,
proxy=self.proxy)

try:
driver.get(url)

@@ -84,10 +126,13 @@ class html_webdriver(Fetcher):

# @todo - how to check this? is it possible?
self.status_code = 200
# @todo somehow we should try to get this working for WebDriver
# raise EmptyReply(url=url, status_code=r.status_code)

# @todo - dom wait loaded?
time.sleep(5)
time.sleep(int(os.getenv("WEBDRIVER_DELAY_BEFORE_CONTENT_READY", 5)))
self.content = driver.page_source
self.headers = {}

driver.quit()

@@ -104,28 +149,35 @@ class html_webdriver(Fetcher):
# driver.quit() seems to cause better exceptions
driver.quit()

return True

# "html_requests" is listed as the default fetcher in store.py!
class html_requests(Fetcher):
fetcher_description = "Basic fast Plaintext/HTTP Client"

def run(self, url, timeout, request_headers):
def run(self, url, timeout, request_headers, request_body, request_method):
import requests

r = requests.get(url,
r = requests.request(method=request_method,
data=request_body,
url=url,
headers=request_headers,
timeout=timeout,
verify=False)

html = r.text
# https://stackoverflow.com/questions/44203397/python-requests-get-returns-improperly-decoded-text-instead-of-utf-8

if not supported_binary_type(r.headers.get('Content-Type', '')):
content = r.text
else:
content = r.content

# @todo test this
if not r or not html or not len(html):
raise EmptyReply(url)
# @todo maybe you really want to test zero-byte return pages?
if not r or not content or not len(content):
raise EmptyReply(url=url, status_code=r.status_code)

self.status_code = r.status_code
self.content = html
self.content = content
self.headers = r.headers
```
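To make the updated `run()` contract concrete: a fetcher subclass now accepts the request body and method and is expected to set `self.content`, `self.status_code` and `self.headers`. A hypothetical extra fetcher, not part of this change, could look like this (assuming it sits next to the `Fetcher` base class above):

```python
import requests


class html_head_only(Fetcher):
    # Hypothetical example: only issues HEAD requests, useful for header-only checks
    fetcher_description = "HEAD-only fetcher (example)"

    def run(self, url, timeout, request_headers, request_body, request_method):
        # request_body / request_method are accepted only to satisfy the interface
        r = requests.head(url, headers=request_headers, timeout=timeout, verify=False)
        self.status_code = r.status_code
        self.content = b""  # HEAD responses carry no body
        self.headers = r.headers
```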
```
@@ -55,13 +55,14 @@ class perform_site_check():

changed_detected = False
stripped_text_from_html = ""
fetched_md5 = ""

original_content_before_filters = False

watch = self.datastore.data['watching'][uuid]

update_obj = {'previous_md5': self.datastore.data['watching'][uuid]['previous_md5'],
'history': {},
"last_checked": timestamp
}
# Unset any existing notification error
update_obj = {'last_notification_error': False, 'last_error': False}

extra_headers = self.datastore.get_val(uuid, 'headers')

@@ -80,6 +81,8 @@ class perform_site_check():
else:
timeout = self.datastore.data['settings']['requests']['timeout']
url = self.datastore.get_val(uuid, 'url')
request_body = self.datastore.get_val(uuid, 'body')
request_method = self.datastore.get_val(uuid, 'method')

# Pluggable content fetcher
prefer_backend = watch['fetch_backend']

@@ -91,7 +94,8 @@ class perform_site_check():

fetcher = klass()
fetcher.run(url, timeout, request_headers)
fetcher.run(url, timeout, request_headers, request_body, request_method)

# Fetching complete, now filters
# @todo move to class / maybe inside of fetcher abstract base?

@@ -101,79 +105,123 @@ class perform_site_check():
# - Do we convert to JSON?
# https://stackoverflow.com/questions/41817578/basic-method-chaining ?
# return content().textfilter().jsonextract().checksumcompare() ?

is_html = True
update_obj['content-type'] = fetcher.headers.get('Content-Type', '').lower().strip()

# Could be 'application/json; charset=utf-8' etc
is_json = 'application/json' in update_obj['content-type']
is_text_or_html = 'text/' in update_obj['content-type'] # text/plain , text/html etc
is_binary = not is_text_or_html and content_fetcher.supported_binary_type(update_obj['content-type'])
css_filter_rule = watch['css_filter']
if css_filter_rule and len(css_filter_rule.strip()):
if 'json:' in css_filter_rule:
stripped_text_from_html = html_tools.extract_json_as_string(content=fetcher.content, jsonpath_filter=css_filter_rule)
is_html = False
else:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
stripped_text_from_html = html_tools.css_filter(css_filter=css_filter_rule, html_content=fetcher.content)
has_filter_rule = css_filter_rule and len(css_filter_rule.strip())

if is_html:
# Auto-detect application/json, make it reformat the JSON to something nice
if is_json and not has_filter_rule:
css_filter_rule = "json:$"
has_filter_rule = True

##### CONVERT THE INPUT TO TEXT, EXTRACT THE PARTS THAT NEED TO BE FILTERED

# Dont depend on the content-type header here, maybe it's not present
if 'json:' in css_filter_rule:
is_json = True
rule = css_filter_rule.replace('json:', '')
stripped_text_from_html = html_tools.extract_json_as_string(content=fetcher.content,
jsonpath_filter=rule).encode('utf-8')
is_text_or_html = False
original_content_before_filters = stripped_text_from_html

if is_text_or_html:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = fetcher.content
if css_filter_rule and len(css_filter_rule.strip()):
html_content = html_tools.css_filter(css_filter=css_filter_rule, html_content=fetcher.content)
if 'text/plain' in update_obj['content-type']:
stripped_text_from_html = html_content

# Assume it's HTML if it's not text/plain
if not 'text/plain' in update_obj['content-type']:
if has_filter_rule:
# For HTML/XML we offer xpath as an option, just start a regular xPath "/.."
if css_filter_rule[0] == '/':
html_content = html_tools.xpath_filter(xpath_filter=css_filter_rule, html_content=fetcher.content)
else:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = html_tools.css_filter(css_filter=css_filter_rule, html_content=fetcher.content)
# get_text() via inscriptis
stripped_text_from_html = get_text(html_content)

# Extract title as title
if self.datastore.data['settings']['application']['extract_title_as_title'] or watch['extract_title_as_title']:
if not watch['title'] or not len(watch['title']):
update_obj['title'] = html_tools.extract_element(find='title', html_content=fetcher.content)

# Re #340 - return the content before the 'ignore text' was applied
original_content_before_filters = stripped_text_from_html.encode('utf-8')

# get_text() via inscriptis
stripped_text_from_html = get_text(html_content)

# We rely on the actual text in the html output.. many sites have random script vars etc,
# in the future we'll implement other mechanisms.

update_obj["last_check_status"] = fetcher.get_last_status_code()
update_obj["last_error"] = False

######## AFTER FILTERING, STRIP OUT IGNORE TEXT
if is_text_or_html:
text_to_ignore = watch.get('ignore_text', []) + self.datastore.data['settings']['application'].get('global_ignore_text', [])
if len(text_to_ignore):
stripped_text_from_html = self.strip_ignore_text(stripped_text_from_html, text_to_ignore)
else:
stripped_text_from_html = stripped_text_from_html.encode('utf8')

# If there's text to skip
# @todo we could abstract out the get_text() to handle this cleaner
if len(watch['ignore_text']):
stripped_text_from_html = self.strip_ignore_text(stripped_text_from_html, watch['ignore_text'])
else:
stripped_text_from_html = stripped_text_from_html.encode('utf8')
######## CALCULATE CHECKSUM FOR DIFF DETECTION
# Re #133 - if we should strip whitespaces from triggering the change detected comparison
if is_text_or_html:
if self.datastore.data['settings']['application'].get('ignore_whitespace', False):
fetched_md5 = hashlib.md5(stripped_text_from_html.translate(None, b'\r\n\t ')).hexdigest()
else:
fetched_md5 = hashlib.md5(stripped_text_from_html).hexdigest()

if is_json:
fetched_md5 = hashlib.md5(stripped_text_from_html).hexdigest()

fetched_md5 = hashlib.md5(stripped_text_from_html).hexdigest()
# Goal here in the future is to be able to abstract out different content type checks into their own class

if is_binary:
# @todo - use some actual image hash here where possible, audio hash, etc etc
m = hashlib.sha256()
m.update(fetcher.content)
fetched_md5 = m.hexdigest()
original_content_before_filters = fetcher.content

# On the first run of a site, watch['previous_md5'] will be an empty string, set it the current one.
if not len(watch['previous_md5']):
watch['previous_md5'] = fetched_md5
update_obj["previous_md5"] = fetched_md5

blocked_by_not_found_trigger_text = False

if len(watch['trigger_text']):
blocked_by_not_found_trigger_text = True
for line in watch['trigger_text']:
# Because JSON wont serialize a re.compile object
if line[0] == '/' and line[-1] == '/':
regex = re.compile(line.strip('/'), re.IGNORECASE)
# Found it? so we don't wait for it anymore
r = re.search(regex, str(stripped_text_from_html))
if r:
# Trigger text can apply to JSON parsed documents too
if is_text_or_html or is_json:
if len(watch['trigger_text']):
blocked_by_not_found_trigger_text = True
for line in watch['trigger_text']:
# Because JSON wont serialize a re.compile object
if line[0] == '/' and line[-1] == '/':
regex = re.compile(line.strip('/'), re.IGNORECASE)
# Found it? so we don't wait for it anymore
r = re.search(regex, str(stripped_text_from_html))
if r:
blocked_by_not_found_trigger_text = False
break

elif line.lower() in str(stripped_text_from_html).lower():
# We found it don't wait for it.
blocked_by_not_found_trigger_text = False
break

elif line.lower() in str(stripped_text_from_html).lower():
# We found it don't wait for it.
blocked_by_not_found_trigger_text = False
break

# could be None or False depending on JSON type
# On the first run of a site, watch['previous_md5'] will be an empty string
if not blocked_by_not_found_trigger_text and watch['previous_md5'] != fetched_md5:
changed_detected = True

# Don't confuse people by updating as last-changed, when it actually just changed from None..
if self.datastore.get_val(uuid, 'previous_md5'):
update_obj["last_changed"] = timestamp

update_obj["previous_md5"] = fetched_md5

# Extract title as title
if is_html:
if self.datastore.data['settings']['application']['extract_title_as_title'] or watch['extract_title_as_title']:
if not watch['title'] or not len(watch['title']):
update_obj['title'] = html_tools.extract_element(find='title', html_content=fetcher.content)
update_obj["last_changed"] = timestamp


return changed_detected, update_obj, stripped_text_from_html
# original_content_before_filters is returned for saving the data to disk
return changed_detected, update_obj, original_content_before_filters
```
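A side note on the trigger-text handling above, with illustrative values: a line wrapped in slashes is treated as a case-insensitive regular expression, and anything else as a plain case-insensitive substring match. Roughly:

```python
import re


def trigger_line_matches(line: str, text: str) -> bool:
    # Mirrors the convention in the diff: "/.../" means regex, otherwise substring
    if line.startswith("/") and line.endswith("/"):
        return re.search(re.compile(line.strip("/"), re.IGNORECASE), text) is not None
    return line.lower() in text.lower()


print(trigger_line_matches("/back in stock/", "Item is Back In Stock now"))  # True
print(trigger_line_matches("sold out", "Currently SOLD OUT"))                # True
```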
@@ -8,6 +8,16 @@ import re
|
||||
|
||||
from changedetectionio.notification import default_notification_format, valid_notification_formats, default_notification_body, default_notification_title
|
||||
|
||||
valid_method = {
|
||||
'GET',
|
||||
'POST',
|
||||
'PUT',
|
||||
'PATCH',
|
||||
'DELETE',
|
||||
}
|
||||
|
||||
default_method = 'GET'
|
||||
|
||||
class StringListField(StringField):
|
||||
widget = widgets.TextArea()
|
||||
|
||||
@@ -106,10 +116,12 @@ class ValidateContentFetcherIsReady(object):
|
||||
except urllib3.exceptions.MaxRetryError as e:
|
||||
driver_url = some_object.command_executor
|
||||
message = field.gettext('Content fetcher \'%s\' did not respond.' % (field.data))
|
||||
message += '<br/>'+field.gettext('Be sure that the selenium/webdriver runner is running and accessible via network from this container/host.')
|
||||
message += '<br/>' + field.gettext(
|
||||
'Be sure that the selenium/webdriver runner is running and accessible via network from this container/host.')
|
||||
message += '<br/>' + field.gettext('Did you follow the instructions in the wiki?')
|
||||
message += '<br/><br/>' + field.gettext('WebDriver Host: %s' % (driver_url))
|
||||
message += '<br/><a href="https://github.com/dgtlmoon/changedetection.io/wiki/Fetching-pages-with-WebDriver">Go here for more information</a>'
|
||||
message += '<br/>'+field.gettext('Content fetcher did not respond properly, unable to use it.\n %s' % (str(e)))
|
||||
|
||||
raise ValidationError(message)
|
||||
|
||||
@@ -118,6 +130,21 @@ class ValidateContentFetcherIsReady(object):
|
||||
raise ValidationError(message % (field.data, e))
|
||||
|
||||
|
||||
class ValidateNotificationBodyAndTitleWhenURLisSet(object):
|
||||
"""
|
||||
Validates that they entered something in both notification title+body when the URL is set
|
||||
Due to https://github.com/dgtlmoon/changedetection.io/issues/360
|
||||
"""
|
||||
|
||||
def __init__(self, message=None):
|
||||
self.message = message
|
||||
|
||||
def __call__(self, form, field):
|
||||
if len(field.data):
|
||||
if not len(form.notification_title.data) or not len(form.notification_body.data):
|
||||
message = field.gettext('Notification Body and Title is required when a Notification URL is used')
|
||||
raise ValidationError(message)
|
||||
|
||||
class ValidateAppRiseServers(object):
|
||||
"""
|
||||
Validates that each URL given is compatible with AppRise
|
||||
@@ -149,7 +176,24 @@ class ValidateTokensList(object):
|
||||
if not p.strip('{}') in notification.valid_tokens:
|
||||
message = field.gettext('Token \'%s\' is not a valid token.')
|
||||
raise ValidationError(message % (p))
|
||||
|
||||
class validateURL(object):
|
||||
|
||||
"""
|
||||
Flask wtform validators wont work with basic auth
|
||||
"""
|
||||
|
||||
def __init__(self, message=None):
|
||||
self.message = message
|
||||
|
||||
def __call__(self, form, field):
|
||||
import validators
|
||||
try:
|
||||
validators.url(field.data.strip())
|
||||
except validators.ValidationFailure:
|
||||
message = field.gettext('\'%s\' is not a valid URL.' % (field.data.strip()))
|
||||
raise ValidationError(message)
|
||||
|
||||
class ValidateListRegex(object):
|
||||
"""
|
||||
Validates that anything that looks like a regex passes as a regex
|
||||
@@ -169,7 +213,7 @@ class ValidateListRegex(object):
|
||||
message = field.gettext('RegEx \'%s\' is not a valid regular expression.')
|
||||
raise ValidationError(message % (line))
|
||||
|
||||
class ValidateCSSJSONInput(object):
|
||||
class ValidateCSSJSONXPATHInput(object):
|
||||
"""
|
||||
Filter validation
|
||||
@todo CSS validator ;)
|
||||
@@ -179,6 +223,24 @@ class ValidateCSSJSONInput(object):
|
||||
self.message = message
|
||||
|
||||
def __call__(self, form, field):
|
||||
|
||||
# Nothing to see here
|
||||
if not len(field.data.strip()):
|
||||
return
|
||||
|
||||
# Does it look like XPath?
|
||||
if field.data.strip()[0] == '/':
|
||||
from lxml import html, etree
|
||||
tree = html.fromstring("<html></html>")
|
||||
|
||||
try:
|
||||
tree.xpath(field.data.strip())
|
||||
except etree.XPathEvalError as e:
|
||||
message = field.gettext('\'%s\' is not a valid XPath expression. (%s)')
|
||||
raise ValidationError(message % (field.data, str(e)))
|
||||
except:
|
||||
raise ValidationError("A system-error occurred when validating your XPath expression")
|
||||
|
||||
if 'json:' in field.data:
|
||||
from jsonpath_ng.exceptions import JsonPathParserError, JsonPathLexerError
|
||||
from jsonpath_ng.ext import parse
|
||||
@@ -190,6 +252,8 @@ class ValidateCSSJSONInput(object):
|
||||
except (JsonPathParserError, JsonPathLexerError) as e:
|
||||
message = field.gettext('\'%s\' is not a valid JSONPath expression. (%s)')
|
||||
raise ValidationError(message % (input, str(e)))
|
||||
except:
|
||||
raise ValidationError("A system-error occurred when validating your JSONPath expression")
|
||||
|
||||
# Re #265 - maybe in the future fetch the page and offer a
# warning/notice that it's possible the rule doesn't yet match anything?

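To make the XPath and JSONPath branches of this validator concrete, here is a rough standalone sketch (not part of the diff), assuming `lxml` and `jsonpath-ng` are installed; the sample expressions are invented.

```python
from lxml import html, etree
from jsonpath_ng.exceptions import JsonPathParserError, JsonPathLexerError
from jsonpath_ng.ext import parse

# XPath branch: evaluate the rule against an empty document purely to surface
# syntax errors, as the validator above does.
tree = html.fromstring("<html></html>")
try:
    tree.xpath("//*[contains(@class, 'sametext')]")   # valid, simply returns []
    tree.xpath("//*[contains(@class, 'sametext'")      # invalid, unterminated
except etree.XPathEvalError as e:
    print("Bad XPath:", e)

# JSONPath branch (used for 'json:'-prefixed filters); expressions here are
# written without the prefix.
try:
    parse("$.items[*].price")   # valid JSONPath
    parse("$.items[*")          # invalid, unterminated bracket
except (JsonPathParserError, JsonPathLexerError) as e:
    print("Bad JSONPath:", e)
```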
@@ -197,12 +261,12 @@ class ValidateCSSJSONInput(object):
|
||||
class quickWatchForm(Form):
|
||||
# https://wtforms.readthedocs.io/en/2.3.x/fields/#module-wtforms.fields.html5
|
||||
# `require_tld` = False is needed even for the test harness "http://localhost:5005.." to run
|
||||
url = html5.URLField('URL', [validators.URL(require_tld=False)])
|
||||
url = html5.URLField('URL', validators=[validateURL()])
|
||||
tag = StringField('Group tag', [validators.Optional(), validators.Length(max=35)])
|
||||
|
||||
class commonSettingsForm(Form):
|
||||
|
||||
notification_urls = StringListField('Notification URL List', validators=[validators.Optional(), ValidateAppRiseServers()])
|
||||
notification_urls = StringListField('Notification URL List', validators=[validators.Optional(), ValidateNotificationBodyAndTitleWhenURLisSet(), ValidateAppRiseServers()])
|
||||
notification_title = StringField('Notification Title', default=default_notification_title, validators=[validators.Optional(), ValidateTokensList()])
|
||||
notification_body = TextAreaField('Notification Body', default=default_notification_body, validators=[validators.Optional(), ValidateTokensList()])
|
||||
notification_format = SelectField('Notification Format', choices=valid_notification_formats.keys(), default=default_notification_format)
|
||||
@@ -212,18 +276,32 @@ class commonSettingsForm(Form):
|
||||
|
||||
class watchForm(commonSettingsForm):
|
||||
|
||||
url = html5.URLField('URL', [validators.URL(require_tld=False)])
|
||||
url = html5.URLField('URL', validators=[validateURL()])
|
||||
tag = StringField('Group tag', [validators.Optional(), validators.Length(max=35)])
|
||||
|
||||
minutes_between_check = html5.IntegerField('Maximum time in minutes until recheck',
|
||||
[validators.Optional(), validators.NumberRange(min=1)])
|
||||
css_filter = StringField('CSS/JSON Filter', [ValidateCSSJSONInput()])
|
||||
css_filter = StringField('CSS/JSON/XPATH Filter', [ValidateCSSJSONXPATHInput()])
|
||||
title = StringField('Title')
|
||||
|
||||
ignore_text = StringListField('Ignore Text', [ValidateListRegex()])
|
||||
headers = StringDictKeyValue('Request Headers')
|
||||
body = TextAreaField('Request Body', [validators.Optional()])
|
||||
method = SelectField('Request Method', choices=valid_method, default=default_method)
|
||||
trigger_text = StringListField('Trigger/wait for text', [validators.Optional(), ValidateListRegex()])
|
||||
|
||||
def validate(self, **kwargs):
|
||||
if not super().validate():
|
||||
return False
|
||||
|
||||
result = True
|
||||
|
||||
# Fail form validation when a body is set for a GET
|
||||
if self.method.data == 'GET' and self.body.data:
|
||||
self.body.errors.append('Body must be empty when Request Method is set to GET')
|
||||
result = False
|
||||
|
||||
return result
|
||||
|
||||
class globalSettingsForm(commonSettingsForm):
|
||||
|
||||
@@ -232,3 +310,5 @@ class globalSettingsForm(commonSettingsForm):
|
||||
[validators.NumberRange(min=1)])
|
||||
extract_title_as_title = BooleanField('Extract <title> from document and use as watch title')
|
||||
base_url = StringField('Base URL', validators=[validators.Optional()])
|
||||
global_ignore_text = StringListField('Ignore Text', [ValidateListRegex()])
|
||||
ignore_whitespace = BooleanField('Ignore whitespace')
|
||||
@@ -17,6 +17,20 @@ def css_filter(css_filter, html_content):
    return html_block + "\n"


# Return a UTF-8 str of the matched rules
def xpath_filter(xpath_filter, html_content):
    from lxml import html
    from lxml import etree

    tree = html.fromstring(html_content)
    html_block = ""

    for item in tree.xpath(xpath_filter.strip()):
        html_block += etree.tostring(item, pretty_print=True).decode('utf-8') + "<br/>"

    return html_block


# Extract/find element
def extract_element(find='title', html_content=''):

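A small usage sketch (not part of the diff) of the new `xpath_filter()` helper above; the HTML snippet and rule are made up.

```python
sample_html = """
<html><body>
  <div class="price">$10.99</div>
  <div class="blurb">ignore me</div>
  <div class="price">$12.50</div>
</body></html>
"""

# Keep only the elements matching the XPath rule, serialised back to markup
filtered = xpath_filter("//div[contains(@class, 'price')]", sample_html)
print(filtered)
# Expected output along the lines of:
# <div class="price">$10.99</div>
# <br/><div class="price">$12.50</div>
# <br/>
```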
41
changedetectionio/image_diff.py
Normal file
@@ -0,0 +1,41 @@
# import the necessary packages
from skimage.metrics import structural_similarity as compare_ssim
import argparse
import imutils
import cv2

# From https://www.pyimagesearch.com/2017/06/19/image-difference-with-opencv-and-python/
def render_diff(fpath_imageA, fpath_imageB):

    imageA = cv2.imread(fpath_imageA)
    imageB = cv2.imread(fpath_imageB)

    # convert the images to grayscale
    grayA = cv2.cvtColor(imageA, cv2.COLOR_BGR2GRAY)
    grayB = cv2.cvtColor(imageB, cv2.COLOR_BGR2GRAY)

    # compute the Structural Similarity Index (SSIM) between the two
    # images, ensuring that the difference image is returned
    (score, diff) = compare_ssim(grayA, grayB, full=True)
    diff = (diff * 255).astype("uint8")
    print("SSIM: {}".format(score))

    # threshold the difference image, followed by finding contours to
    # obtain the regions of the two input images that differ
    thresh = cv2.threshold(diff, 0, 255,
                           cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)[1]
    cnts = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL,
                            cv2.CHAIN_APPROX_SIMPLE)
    cnts = imutils.grab_contours(cnts)

    # loop over the contours
    for c in cnts:
        # compute the bounding box of the contour and then draw the
        # bounding box on both input images to represent where the two
        # images differ
        (x, y, w, h) = cv2.boundingRect(c)
        cv2.rectangle(imageA, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.rectangle(imageB, (x, y), (x + w, y + h), (0, 0, 255), 2)

    #return cv2.imencode('.jpg', imageB)[1].tobytes()
    return cv2.imencode('.jpg', imageA)[1].tobytes()
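A brief usage sketch (not part of the diff) for the new `render_diff()` helper above; the file names are placeholders and `scikit-image`, `imutils` and `opencv-python` are assumed to be installed.

```python
from changedetectionio.image_diff import render_diff

# Compare two saved screenshots; render_diff() prints the SSIM score
jpeg_bytes = render_diff('snapshot-before.jpg', 'snapshot-after.jpg')

# The return value is a JPEG of the first image with red boxes around changed regions
with open('diff-highlighted.jpg', 'wb') as f:
    f.write(jpeg_bytes)
```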
@@ -25,9 +25,7 @@ default_notification_body = '{watch_url} had a change.\n---\n{diff}\n---\n'
|
||||
default_notification_title = 'ChangeDetection.io Notification - {watch_url}'
|
||||
|
||||
def process_notification(n_object, datastore):
|
||||
import logging
|
||||
log = logging.getLogger('apprise')
|
||||
log.setLevel('TRACE')
|
||||
|
||||
apobj = apprise.Apprise(debug=True)
|
||||
|
||||
for url in n_object['notification_urls']:
|
||||
@@ -53,11 +51,22 @@ def process_notification(n_object, datastore):
|
||||
n_title = n_title.replace(token, val)
|
||||
n_body = n_body.replace(token, val)
|
||||
|
||||
apobj.notify(
|
||||
# https://github.com/caronc/apprise/wiki/Development_LogCapture
|
||||
# Anything higher than or equal to WARNING (which covers things like Connection errors)
|
||||
# raise it as an exception
|
||||
|
||||
with apprise.LogCapture(level=apprise.logging.DEBUG) as logs:
|
||||
apobj.notify(
|
||||
body=n_body,
|
||||
title=n_title,
|
||||
body_format=n_format,
|
||||
)
|
||||
body_format=n_format)
|
||||
|
||||
# Returns empty string if nothing found, multi-line string otherwise
|
||||
log_value = logs.getvalue()
|
||||
if log_value and 'WARNING' in log_value or 'ERROR' in log_value:
|
||||
raise Exception(log_value)
|
||||
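For reference, a standalone sketch (not part of the diff) of the Apprise LogCapture pattern used above to turn notification warnings and errors into exceptions; the notification URL is a placeholder.

```python
import apprise

apobj = apprise.Apprise(debug=True)
apobj.add('json://localhost:9999/this-endpoint-does-not-exist')  # placeholder target

# Anything at WARNING or above (e.g. connection errors) shows up in the captured log
with apprise.LogCapture(level=apprise.logging.DEBUG) as logs:
    apobj.notify(body='test body', title='test title')

log_value = logs.getvalue()
if 'WARNING' in log_value or 'ERROR' in log_value:
    raise Exception(log_value)
```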
|
||||
|
||||
|
||||
# Notification title + body content parameters get created here.
|
||||
def create_notification_parameters(n_object, datastore):
|
||||
|
||||
355
changedetectionio/static/styles/package-lock.json
generated
@@ -9,7 +9,7 @@
|
||||
"version": "0.0.3",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"node-sass": "^6.0.1",
|
||||
"node-sass": "^7.0.0",
|
||||
"tar": "^6.1.9",
|
||||
"trim-newlines": "^3.0.1"
|
||||
}
|
||||
@@ -128,13 +128,35 @@
|
||||
}
|
||||
},
|
||||
"node_modules/ansi-styles": {
|
||||
"version": "2.2.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-2.2.1.tgz",
|
||||
"integrity": "sha1-tDLdM1i2NM914eRmQ2gkBTPB3b4=",
|
||||
"version": "4.3.0",
|
||||
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
|
||||
"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
|
||||
"dependencies": {
|
||||
"color-convert": "^2.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.10.0"
|
||||
"node": ">=8"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/ansi-styles/node_modules/color-convert": {
|
||||
"version": "2.0.1",
|
||||
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
|
||||
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
|
||||
"dependencies": {
|
||||
"color-name": "~1.1.4"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=7.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/ansi-styles/node_modules/color-name": {
|
||||
"version": "1.1.4",
|
||||
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
|
||||
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
|
||||
},
|
||||
"node_modules/aproba": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/aproba/-/aproba-1.2.0.tgz",
|
||||
@@ -251,18 +273,18 @@
|
||||
"integrity": "sha1-G2gcIf+EAzyCZUMJBolCDRhxUdw="
|
||||
},
|
||||
"node_modules/chalk": {
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/chalk/-/chalk-1.1.3.tgz",
|
||||
"integrity": "sha1-qBFcVeSnAv5NFQq9OHKCKn4J/Jg=",
|
||||
"version": "4.1.2",
|
||||
"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
|
||||
"integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
|
||||
"dependencies": {
|
||||
"ansi-styles": "^2.2.1",
|
||||
"escape-string-regexp": "^1.0.2",
|
||||
"has-ansi": "^2.0.0",
|
||||
"strip-ansi": "^3.0.0",
|
||||
"supports-color": "^2.0.0"
|
||||
"ansi-styles": "^4.1.0",
|
||||
"supports-color": "^7.1.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.10.0"
|
||||
"node": ">=10"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/chalk?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/chownr": {
|
||||
@@ -344,6 +366,14 @@
|
||||
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz",
|
||||
"integrity": "sha1-p9BVi9icQveV3UIyj3QIMcpTvCU="
|
||||
},
|
||||
"node_modules/color-support": {
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/color-support/-/color-support-1.1.3.tgz",
|
||||
"integrity": "sha512-qiBjkpbMLO/HL68y+lh4q0/O1MZFj2RX6X/KmMa3+gJD3z+WwI1ZzDHysvqHGS3mP6mznPckpXmw1nI9cJjyRg==",
|
||||
"bin": {
|
||||
"color-support": "bin.js"
|
||||
}
|
||||
},
|
||||
"node_modules/combined-stream": {
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
@@ -677,17 +707,6 @@
|
||||
"node": ">= 0.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/has-ansi": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-ansi/-/has-ansi-2.0.0.tgz",
|
||||
"integrity": "sha1-NPUEnOHs3ysGSa8+8k5F7TVBbZE=",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^2.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/has-flag": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz",
|
||||
@@ -1042,13 +1061,13 @@
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/node-sass/-/node-sass-6.0.1.tgz",
|
||||
"integrity": "sha512-f+Rbqt92Ful9gX0cGtdYwjTrWAaGURgaK5rZCWOgCNyGWusFYHhbqCCBoFBeat+HKETOU02AyTxNhJV0YZf2jQ==",
|
||||
"version": "7.0.0",
|
||||
"resolved": "https://registry.npmjs.org/node-sass/-/node-sass-7.0.0.tgz",
|
||||
"integrity": "sha512-6yUnsD3L8fVbgMX6nKQqZkjRcG7a/PpmF0pEyeWf+BgbTj2ToJlCYrnUifL2KbjV5gIY22I3oppahBWA3B+jUg==",
|
||||
"hasInstallScript": true,
|
||||
"dependencies": {
|
||||
"async-foreach": "^0.1.3",
|
||||
"chalk": "^1.1.1",
|
||||
"chalk": "^4.1.2",
|
||||
"cross-spawn": "^7.0.3",
|
||||
"gaze": "^1.0.0",
|
||||
"get-stdin": "^4.0.1",
|
||||
@@ -1057,7 +1076,7 @@
|
||||
"meow": "^9.0.0",
|
||||
"nan": "^2.13.2",
|
||||
"node-gyp": "^7.1.0",
|
||||
"npmlog": "^4.0.0",
|
||||
"npmlog": "^5.0.0",
|
||||
"request": "^2.88.0",
|
||||
"sass-graph": "2.2.5",
|
||||
"stdout-stream": "^1.4.0",
|
||||
@@ -1070,6 +1089,106 @@
|
||||
"node": ">=12"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/ansi-regex": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
|
||||
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/are-we-there-yet": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-2.0.0.tgz",
|
||||
"integrity": "sha512-Ci/qENmwHnsYo9xKIcUJN5LeDKdJ6R1Z1j9V/J5wyq8nh/mYPEpIKJbBZXtZjG04HiK7zV/p6Vs9952MrMeUIw==",
|
||||
"dependencies": {
|
||||
"delegates": "^1.0.0",
|
||||
"readable-stream": "^3.6.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=10"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/emoji-regex": {
|
||||
"version": "8.0.0",
|
||||
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
|
||||
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
|
||||
},
|
||||
"node_modules/node-sass/node_modules/gauge": {
|
||||
"version": "3.0.2",
|
||||
"resolved": "https://registry.npmjs.org/gauge/-/gauge-3.0.2.tgz",
|
||||
"integrity": "sha512-+5J6MS/5XksCuXq++uFRsnUd7Ovu1XenbeuIuNRJxYWjgQbPuFhT14lAvsWfqfAmnwluf1OwMjz39HjfLPci0Q==",
|
||||
"dependencies": {
|
||||
"aproba": "^1.0.3 || ^2.0.0",
|
||||
"color-support": "^1.1.2",
|
||||
"console-control-strings": "^1.0.0",
|
||||
"has-unicode": "^2.0.1",
|
||||
"object-assign": "^4.1.1",
|
||||
"signal-exit": "^3.0.0",
|
||||
"string-width": "^4.2.3",
|
||||
"strip-ansi": "^6.0.1",
|
||||
"wide-align": "^1.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=10"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/is-fullwidth-code-point": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
|
||||
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/npmlog": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/npmlog/-/npmlog-5.0.1.tgz",
|
||||
"integrity": "sha512-AqZtDUWOMKs1G/8lwylVjrdYgqA4d9nu8hc+0gzRxlDb1I10+FHBGMXs6aiQHFdCUUlqH99MUMuLfzWDNDtfxw==",
|
||||
"dependencies": {
|
||||
"are-we-there-yet": "^2.0.0",
|
||||
"console-control-strings": "^1.1.0",
|
||||
"gauge": "^3.0.0",
|
||||
"set-blocking": "^2.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/readable-stream": {
|
||||
"version": "3.6.0",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.0.tgz",
|
||||
"integrity": "sha512-BViHy7LKeTz4oNnkcLJ+lVSL6vpiFeX6/d3oSH8zCW7UxP2onchk+vTGB143xuFjHS3deTgkKoXXymXqymiIdA==",
|
||||
"dependencies": {
|
||||
"inherits": "^2.0.3",
|
||||
"string_decoder": "^1.1.1",
|
||||
"util-deprecate": "^1.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/string-width": {
|
||||
"version": "4.2.3",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
|
||||
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
|
||||
"dependencies": {
|
||||
"emoji-regex": "^8.0.0",
|
||||
"is-fullwidth-code-point": "^3.0.0",
|
||||
"strip-ansi": "^6.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/node-sass/node_modules/strip-ansi": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/nopt": {
|
||||
"version": "5.0.0",
|
||||
"resolved": "https://registry.npmjs.org/nopt/-/nopt-5.0.0.tgz",
|
||||
@@ -1616,11 +1735,22 @@
|
||||
}
|
||||
},
|
||||
"node_modules/supports-color": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-2.0.0.tgz",
|
||||
"integrity": "sha1-U10EXOa2Nj+kARcIRimZXp3zJMc=",
|
||||
"version": "7.2.0",
|
||||
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
|
||||
"integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
|
||||
"dependencies": {
|
||||
"has-flag": "^4.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.8.0"
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/supports-color/node_modules/has-flag": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
|
||||
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/tar": {
|
||||
@@ -2050,9 +2180,27 @@
|
||||
"integrity": "sha1-w7M6te42DYbg5ijwRorn7yfWVN8="
|
||||
},
|
||||
"ansi-styles": {
|
||||
"version": "2.2.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-2.2.1.tgz",
|
||||
"integrity": "sha1-tDLdM1i2NM914eRmQ2gkBTPB3b4="
|
||||
"version": "4.3.0",
|
||||
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
|
||||
"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
|
||||
"requires": {
|
||||
"color-convert": "^2.0.1"
|
||||
},
|
||||
"dependencies": {
|
||||
"color-convert": {
|
||||
"version": "2.0.1",
|
||||
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
|
||||
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
|
||||
"requires": {
|
||||
"color-name": "~1.1.4"
|
||||
}
|
||||
},
|
||||
"color-name": {
|
||||
"version": "1.1.4",
|
||||
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
|
||||
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"aproba": {
|
||||
"version": "1.2.0",
|
||||
@@ -2149,15 +2297,12 @@
|
||||
"integrity": "sha1-G2gcIf+EAzyCZUMJBolCDRhxUdw="
|
||||
},
|
||||
"chalk": {
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/chalk/-/chalk-1.1.3.tgz",
|
||||
"integrity": "sha1-qBFcVeSnAv5NFQq9OHKCKn4J/Jg=",
|
||||
"version": "4.1.2",
|
||||
"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
|
||||
"integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
|
||||
"requires": {
|
||||
"ansi-styles": "^2.2.1",
|
||||
"escape-string-regexp": "^1.0.2",
|
||||
"has-ansi": "^2.0.0",
|
||||
"strip-ansi": "^3.0.0",
|
||||
"supports-color": "^2.0.0"
|
||||
"ansi-styles": "^4.1.0",
|
||||
"supports-color": "^7.1.0"
|
||||
}
|
||||
},
|
||||
"chownr": {
|
||||
@@ -2223,6 +2368,11 @@
|
||||
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz",
|
||||
"integrity": "sha1-p9BVi9icQveV3UIyj3QIMcpTvCU="
|
||||
},
|
||||
"color-support": {
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/color-support/-/color-support-1.1.3.tgz",
|
||||
"integrity": "sha512-qiBjkpbMLO/HL68y+lh4q0/O1MZFj2RX6X/KmMa3+gJD3z+WwI1ZzDHysvqHGS3mP6mznPckpXmw1nI9cJjyRg=="
|
||||
},
|
||||
"combined-stream": {
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
@@ -2485,14 +2635,6 @@
|
||||
"function-bind": "^1.1.1"
|
||||
}
|
||||
},
|
||||
"has-ansi": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-ansi/-/has-ansi-2.0.0.tgz",
|
||||
"integrity": "sha1-NPUEnOHs3ysGSa8+8k5F7TVBbZE=",
|
||||
"requires": {
|
||||
"ansi-regex": "^2.0.0"
|
||||
}
|
||||
},
|
||||
"has-flag": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz",
|
||||
@@ -2768,12 +2910,12 @@
|
||||
}
|
||||
},
|
||||
"node-sass": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/node-sass/-/node-sass-6.0.1.tgz",
|
||||
"integrity": "sha512-f+Rbqt92Ful9gX0cGtdYwjTrWAaGURgaK5rZCWOgCNyGWusFYHhbqCCBoFBeat+HKETOU02AyTxNhJV0YZf2jQ==",
|
||||
"version": "7.0.0",
|
||||
"resolved": "https://registry.npmjs.org/node-sass/-/node-sass-7.0.0.tgz",
|
||||
"integrity": "sha512-6yUnsD3L8fVbgMX6nKQqZkjRcG7a/PpmF0pEyeWf+BgbTj2ToJlCYrnUifL2KbjV5gIY22I3oppahBWA3B+jUg==",
|
||||
"requires": {
|
||||
"async-foreach": "^0.1.3",
|
||||
"chalk": "^1.1.1",
|
||||
"chalk": "^4.1.2",
|
||||
"cross-spawn": "^7.0.3",
|
||||
"gaze": "^1.0.0",
|
||||
"get-stdin": "^4.0.1",
|
||||
@@ -2782,11 +2924,92 @@
|
||||
"meow": "^9.0.0",
|
||||
"nan": "^2.13.2",
|
||||
"node-gyp": "^7.1.0",
|
||||
"npmlog": "^4.0.0",
|
||||
"npmlog": "^5.0.0",
|
||||
"request": "^2.88.0",
|
||||
"sass-graph": "2.2.5",
|
||||
"stdout-stream": "^1.4.0",
|
||||
"true-case-path": "^1.0.2"
|
||||
},
|
||||
"dependencies": {
|
||||
"ansi-regex": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
|
||||
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ=="
|
||||
},
|
||||
"are-we-there-yet": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-2.0.0.tgz",
|
||||
"integrity": "sha512-Ci/qENmwHnsYo9xKIcUJN5LeDKdJ6R1Z1j9V/J5wyq8nh/mYPEpIKJbBZXtZjG04HiK7zV/p6Vs9952MrMeUIw==",
|
||||
"requires": {
|
||||
"delegates": "^1.0.0",
|
||||
"readable-stream": "^3.6.0"
|
||||
}
|
||||
},
|
||||
"emoji-regex": {
|
||||
"version": "8.0.0",
|
||||
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
|
||||
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
|
||||
},
|
||||
"gauge": {
|
||||
"version": "3.0.2",
|
||||
"resolved": "https://registry.npmjs.org/gauge/-/gauge-3.0.2.tgz",
|
||||
"integrity": "sha512-+5J6MS/5XksCuXq++uFRsnUd7Ovu1XenbeuIuNRJxYWjgQbPuFhT14lAvsWfqfAmnwluf1OwMjz39HjfLPci0Q==",
|
||||
"requires": {
|
||||
"aproba": "^1.0.3 || ^2.0.0",
|
||||
"color-support": "^1.1.2",
|
||||
"console-control-strings": "^1.0.0",
|
||||
"has-unicode": "^2.0.1",
|
||||
"object-assign": "^4.1.1",
|
||||
"signal-exit": "^3.0.0",
|
||||
"string-width": "^4.2.3",
|
||||
"strip-ansi": "^6.0.1",
|
||||
"wide-align": "^1.1.2"
|
||||
}
|
||||
},
|
||||
"is-fullwidth-code-point": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
|
||||
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg=="
|
||||
},
|
||||
"npmlog": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/npmlog/-/npmlog-5.0.1.tgz",
|
||||
"integrity": "sha512-AqZtDUWOMKs1G/8lwylVjrdYgqA4d9nu8hc+0gzRxlDb1I10+FHBGMXs6aiQHFdCUUlqH99MUMuLfzWDNDtfxw==",
|
||||
"requires": {
|
||||
"are-we-there-yet": "^2.0.0",
|
||||
"console-control-strings": "^1.1.0",
|
||||
"gauge": "^3.0.0",
|
||||
"set-blocking": "^2.0.0"
|
||||
}
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "3.6.0",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.0.tgz",
|
||||
"integrity": "sha512-BViHy7LKeTz4oNnkcLJ+lVSL6vpiFeX6/d3oSH8zCW7UxP2onchk+vTGB143xuFjHS3deTgkKoXXymXqymiIdA==",
|
||||
"requires": {
|
||||
"inherits": "^2.0.3",
|
||||
"string_decoder": "^1.1.1",
|
||||
"util-deprecate": "^1.0.1"
|
||||
}
|
||||
},
|
||||
"string-width": {
|
||||
"version": "4.2.3",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
|
||||
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
|
||||
"requires": {
|
||||
"emoji-regex": "^8.0.0",
|
||||
"is-fullwidth-code-point": "^3.0.0",
|
||||
"strip-ansi": "^6.0.1"
|
||||
}
|
||||
},
|
||||
"strip-ansi": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"requires": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"nopt": {
|
||||
@@ -3213,9 +3436,19 @@
|
||||
}
|
||||
},
|
||||
"supports-color": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-2.0.0.tgz",
|
||||
"integrity": "sha1-U10EXOa2Nj+kARcIRimZXp3zJMc="
|
||||
"version": "7.2.0",
|
||||
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
|
||||
"integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
|
||||
"requires": {
|
||||
"has-flag": "^4.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"has-flag": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
|
||||
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"tar": {
|
||||
"version": "6.1.9",
|
||||
|
||||
@@ -10,7 +10,7 @@
|
||||
"author": "",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"node-sass": "^6.0.1",
|
||||
"node-sass": "^7.0.0",
|
||||
"tar": "^6.1.9",
|
||||
"trim-newlines": "^3.0.1"
|
||||
}
|
||||
|
||||
@@ -45,11 +45,13 @@ class ChangeDetectionStore:
|
||||
'base_url' : None,
|
||||
'extract_title_as_title': False,
|
||||
'fetch_backend': 'html_requests',
|
||||
'global_ignore_text': [], # List of text to ignore when calculating the comparison checksum
|
||||
'ignore_whitespace': False,
|
||||
'notification_urls': [], # Apprise URL list
|
||||
# Custom notification content
|
||||
'notification_title': None,
|
||||
'notification_body': None,
|
||||
'notification_format': None
|
||||
'notification_title': default_notification_title,
|
||||
'notification_body': default_notification_body,
|
||||
'notification_format': default_notification_format,
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -70,13 +72,15 @@ class ChangeDetectionStore:
|
||||
'previous_md5': "",
|
||||
'uuid': str(uuid_builder.uuid4()),
|
||||
'headers': {}, # Extra headers to send
|
||||
'body': None,
|
||||
'method': 'GET',
|
||||
'history': {}, # Dict of timestamp and output stripped filename
|
||||
'ignore_text': [], # List of text to ignore when calculating the comparison checksum
|
||||
# Custom notification content
|
||||
'notification_urls': [], # List of URLs to add to the notification Queue (Usually AppRise)
|
||||
'notification_title': None,
|
||||
'notification_body': None,
|
||||
'notification_format': None,
|
||||
'notification_title': default_notification_title,
|
||||
'notification_body': default_notification_body,
|
||||
'notification_format': default_notification_format,
|
||||
'css_filter': "",
|
||||
'trigger_text': [], # List of text or regex to wait for until a change is detected
|
||||
'fetch_backend': None,
|
||||
@@ -129,7 +133,7 @@ class ChangeDetectionStore:
|
||||
self.add_watch(url='http://www.quotationspage.com/random.php', tag='test')
|
||||
self.add_watch(url='https://news.ycombinator.com/', tag='Tech news')
|
||||
self.add_watch(url='https://www.gov.uk/coronavirus', tag='Covid')
|
||||
self.add_watch(url='https://changedetection.io', tag='Tech news')
|
||||
self.add_watch(url='https://changedetection.io/CHANGELOG.txt')
|
||||
|
||||
self.__data['version_tag'] = version_tag
|
||||
|
||||
@@ -297,10 +301,10 @@ class ChangeDetectionStore:
|
||||
del_timestamps.append(timestamp)
|
||||
changes_removed += 1
|
||||
|
||||
if not limit_timestamp:
|
||||
self.data['watching'][uuid]['last_checked'] = 0
|
||||
self.data['watching'][uuid]['last_changed'] = 0
|
||||
self.data['watching'][uuid]['previous_md5'] = 0
|
||||
if not limit_timestamp:
|
||||
self.data['watching'][uuid]['last_checked'] = 0
|
||||
self.data['watching'][uuid]['last_changed'] = 0
|
||||
self.data['watching'][uuid]['previous_md5'] = ""
|
||||
|
||||
|
||||
for timestamp in del_timestamps:
|
||||
@@ -319,13 +323,13 @@ class ChangeDetectionStore:
|
||||
content = fp.read()
|
||||
self.data['watching'][uuid]['previous_md5'] = hashlib.md5(content).hexdigest()
|
||||
except (FileNotFoundError, IOError):
|
||||
self.data['watching'][uuid]['previous_md5'] = False
|
||||
self.data['watching'][uuid]['previous_md5'] = ""
|
||||
pass
|
||||
|
||||
self.needs_write = True
|
||||
return changes_removed
|
||||
|
||||
def add_watch(self, url, tag, extras=None):
|
||||
def add_watch(self, url, tag="", extras=None):
|
||||
if extras is None:
|
||||
extras = {}
|
||||
|
||||
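A quick sketch (not part of the diff) of calling `add_watch()` with the now-optional tag and an `extras` dict; `datastore` stands for a ChangeDetectionStore instance, the URL, tag and extras keys are illustrative, and the entries in `extras` are assumed to override the per-watch defaults listed above.

```python
# tag now defaults to ""; extras can pre-set any per-watch key
# add_watch() is assumed here to return the new watch's uuid
uuid = datastore.add_watch(url='https://example.com/pricing')
uuid_with_extras = datastore.add_watch(
    url='https://example.com/pricing',
    tag='shopping',
    extras={'css_filter': '#price', 'minutes_between_check': 60},
)
```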
@@ -364,7 +368,13 @@ class ChangeDetectionStore:
|
||||
import uuid
|
||||
|
||||
output_path = "{}/{}".format(self.datastore_path, watch_uuid)
|
||||
fname = "{}/{}.stripped.txt".format(output_path, uuid.uuid4())
|
||||
# In case the operator deleted it, check and create.
|
||||
if not os.path.isdir(output_path):
|
||||
mkdir(output_path)
|
||||
|
||||
suffix = "stripped.txt"
|
||||
|
||||
fname = "{}/{}.{}".format(output_path, uuid.uuid4(), suffix)
|
||||
with open(fname, 'wb') as f:
|
||||
f.write(contents)
|
||||
f.close()
|
||||
|
||||
@@ -10,9 +10,13 @@
|
||||
AWS SNS - sns://AccessKeyID/AccessSecretKey/RegionName/+PhoneNo
|
||||
SMTPS - mailtos://user:pass@mail.domain.com?to=receivingAddress@example.com")
|
||||
}}
|
||||
<div class="pure-form-message-inline">Use <a target=_new
|
||||
href="https://github.com/caronc/apprise">AppRise
|
||||
URLs</a> for notification to just about any service! <i><a target=_new href="https://github.com/dgtlmoon/changedetection.io/wiki/Notification-configuration-notes">Please read the notification services wiki here for important configuration notes</a></i>
|
||||
<div class="pure-form-message-inline">
|
||||
<ul>
|
||||
<li>Use <a target=_new href="https://github.com/caronc/apprise">AppRise URLs</a> for notification to just about any service! <i><a target=_new href="https://github.com/dgtlmoon/changedetection.io/wiki/Notification-configuration-notes">Please read the notification services wiki here for important configuration notes</a></i>.</li>
|
||||
<li><code>discord://</code> will silently fail if the total message length is more than 2000 chars.</li>
|
||||
<li><code>tgram://</code> bots can't send messages to other bots, so you should specify the chat ID of a non-bot user.</li>
|
||||
<li>Go here for <a href="{{url_for('notification_logs')}}">Notification debug logs</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
<div id="notification-customisation">
|
||||
|
||||
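To make the notification URL hints above concrete, a few placeholder Apprise-style URLs of the kind the Notification URL List field accepts (the hosts, tokens and IDs below are invented):

```python
# Placeholder notification URLs, one per line in the "Notification URL List" field
notification_urls = [
    "mailtos://user:pass@mail.example.com?to=alerts@example.com",
    "discord://webhook_id/webhook_token",   # silently fails past 2000 chars, per the note above
    "tgram://bot_token/chat_id",            # target a non-bot chat ID
]
```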
59
changedetectionio/templates/diff-image.html
Normal file
@@ -0,0 +1,59 @@
|
||||
{% extends 'base.html' %}
|
||||
|
||||
{% block content %}
|
||||
|
||||
<div id="settings">
|
||||
<h1>Differences</h1>
|
||||
<form class="pure-form " action="" method="GET">
|
||||
<fieldset>
|
||||
{% if versions|length >= 1 %}
|
||||
<label for="diff-version">Compare newest (<span id="current-v-date"></span>) with</label>
|
||||
<select id="diff-version" name="previous_version">
|
||||
{% for version in versions %}
|
||||
<option value="{{version}}" {% if version== current_previous_version %} selected="" {% endif %}>
|
||||
{{version}}
|
||||
</option>
|
||||
{% endfor %}
|
||||
</select>
|
||||
<button type="submit" class="pure-button pure-button-primary">Go</button>
|
||||
{% endif %}
|
||||
</fieldset>
|
||||
</form>
|
||||
|
||||
</div>
|
||||
|
||||
<div id="diff-ui">
|
||||
<img style="max-width: 100%" src="{{ url_for('render_diff_image', uuid=uuid, compare_date=current_previous_version) }}" />
|
||||
|
||||
<div>
|
||||
<span style="width: 50%">
|
||||
<img style="max-width: 100%" src="{{ url_for('show_single_image', uuid=uuid, datestr=newest_version_timestamp) }}" />
|
||||
</span>
|
||||
<span style="width: 50%">
|
||||
<img style="max-width: 100%" src="{{ url_for('show_single_image', uuid=uuid, datestr=current_previous_version) }}" />
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
|
||||
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='diff.js')}}"></script>
|
||||
|
||||
<script defer="">
|
||||
window.onload = function() {
|
||||
/* Set current version date as local time in the browser also */
|
||||
var current_v = document.getElementById("current-v-date");
|
||||
var dateObject = new Date({{ newest_version_timestamp }}*1000);
|
||||
current_v.innerHTML=dateObject.toLocaleString();
|
||||
|
||||
/* Convert what is options from UTC time.time() to local browser time */
|
||||
var diffList=document.getElementById("diff-version");
|
||||
if (typeof(diffList) != 'undefined' && diffList != null) {
|
||||
for (var option of diffList.options) {
|
||||
var dateObject = new Date(option.value*1000);
|
||||
option.label=dateObject.toLocaleString();
|
||||
}
|
||||
}
|
||||
}
|
||||
</script>
|
||||
|
||||
{% endblock %}
|
||||
@@ -9,9 +9,9 @@
|
||||
<div class="tabs">
|
||||
<ul>
|
||||
<li class="tab" id="default-tab"><a href="#general">General</a></li>
|
||||
<li class="tab"><a href="#request">Request</a></li>
|
||||
<li class="tab"><a href="#filters-and-triggers">Filters & Triggers</a></li>
|
||||
<li class="tab"><a href="#notifications">Notifications</a></li>
|
||||
<li class="tab"><a href="#filters">Filters</a></li>
|
||||
<li class="tab"><a href="#triggers">Triggers</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
|
||||
@@ -23,6 +23,7 @@
|
||||
<fieldset>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.url, placeholder="https://...", required=true, class="m-d") }}
|
||||
<span class="pure-form-message-inline">Some sites use JavaScript to create the content, for this you should <a href="https://github.com/dgtlmoon/changedetection.io/wiki/Fetching-pages-with-WebDriver">use the Chrome/WebDriver Fetcher</a></span>
|
||||
</div>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.title, class="m-d") }}
|
||||
@@ -41,29 +42,43 @@
|
||||
href="{{ url_for('settings_page', uuid=uuid) }}">default global settings</a>.</span>
|
||||
{% endif %}
|
||||
</div>
|
||||
<fieldset class="pure-group">
|
||||
{{ render_field(form.headers, rows=5, placeholder="Example
|
||||
Cookie: foobar
|
||||
User-Agent: wonderbra 1.0") }}
|
||||
<span class="pure-form-message-inline">
|
||||
Note: ONLY used by Basic fast Plaintext/HTTP Client
|
||||
</span>
|
||||
</fieldset>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.fetch_backend) }}
|
||||
<span class="pure-form-message-inline">
|
||||
<p>Use the <strong>Basic</strong> method (default) where your watched sites don't need Javascript to render.</p>
|
||||
<p>The <strong>Chrome/Javascript</strong> method requires a network connection to a running WebDriver+Chrome server, set by the ENV var 'WEBDRIVER_URL'. </p>
|
||||
</span>
|
||||
</div>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.extract_title_as_title) }}
|
||||
</div>
|
||||
</fieldset>
|
||||
</div>
|
||||
|
||||
<div class="tab-pane-inner" id="request">
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.fetch_backend) }}
|
||||
<span class="pure-form-message-inline">
|
||||
<p>Use the <strong>Basic</strong> method (default) where your watched site doesn't need Javascript to render.</p>
|
||||
<p>The <strong>Chrome/Javascript</strong> method requires a network connection to a running WebDriver+Chrome server, set by the ENV var 'WEBDRIVER_URL'. </p>
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<fieldset class="pure-group">
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.method) }}
|
||||
</div>
|
||||
<strong>Note: <i>Request Headers and Body settings are ONLY used by Basic fast Plaintext/HTTP Client fetch method.</i></strong>
|
||||
{{ render_field(form.headers, rows=5, placeholder="Example
|
||||
Cookie: foobar
|
||||
User-Agent: wonderbra 1.0") }}
|
||||
</fieldset>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.body, rows=5, placeholder="Example
|
||||
{
|
||||
\"name\":\"John\",
|
||||
\"age\":30,
|
||||
\"car\":null
|
||||
}") }}
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<div class="tab-pane-inner" id="notifications">
|
||||
<strong>Note: <i>These settings override the global settings.</i></strong>
|
||||
<strong>Note: <i>These settings override the global settings for this watch.</i></strong>
|
||||
<fieldset>
|
||||
<div class="field-group">
|
||||
{{ render_common_settings_form(form, current_base_url) }}
|
||||
@@ -71,7 +86,7 @@ User-Agent: wonderbra 1.0") }}
|
||||
</fieldset>
|
||||
</div>
|
||||
|
||||
<div class="tab-pane-inner" id="filters">
|
||||
<div class="tab-pane-inner" id="filters-and-triggers">
|
||||
<fieldset>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.css_filter, placeholder=".class-name or #some-id, or other CSS selector rule.",
|
||||
@@ -81,8 +96,10 @@ User-Agent: wonderbra 1.0") }}
|
||||
<li>CSS - Limit text to this CSS rule, only text matching this CSS rule is included.</li>
|
||||
<li>JSON - Limit text to this JSON rule, using <a href="https://pypi.org/project/jsonpath-ng/">JSONPath</a>, prefix with <b>"json:"</b>, <a
|
||||
href="https://jsonpath.com/" target="new">test your JSONPath here</a></li>
|
||||
<li>XPATH - Limit text to this XPath rule, simply start with a forward-slash, example <b>//*[contains(@class, 'sametext')]</b>, <a
|
||||
href="http://xpather.com/" target="new">test your XPath here</a></li>
|
||||
</ul>
|
||||
Please be sure that you thoroughly understand how to write CSS or JSONPath selector rules before filing an issue on GitHub! <a
|
||||
Please be sure that you thoroughly understand how to write CSS or JSONPath, XPath selector rules before filing an issue on GitHub! <a
|
||||
href="https://github.com/dgtlmoon/changedetection.io/wiki/CSS-Selector-help">here for more CSS selector help</a>.<br/>
|
||||
</span>
|
||||
</div>
|
||||
@@ -93,26 +110,31 @@ User-Agent: wonderbra 1.0") }}
|
||||
/some.regex\d{2}/ for case-INsensitive regex
|
||||
") }}
|
||||
<span class="pure-form-message-inline">
|
||||
Each line processed separately, any line matching will be ignored.<br/>
|
||||
Regular Expression support, wrap the line in forward slash <b>/regex/</b>.
|
||||
<ul>
|
||||
<li>Each line processed separately, any line matching will be ignored (removed before creating the checksum)</li>
|
||||
<li>Regular Expression support, wrap the line in forward slash <b>/regex/</b></li>
|
||||
<li>Changing this will affect the comparison checksum which may trigger an alert</li>
|
||||
</ul>
|
||||
</span>
|
||||
|
||||
</fieldset>
|
||||
</div>
|
||||
|
||||
<div class="tab-pane-inner" id="triggers">
|
||||
<fieldset>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.trigger_text, rows=5, placeholder="Some text to wait for in a line
|
||||
/some.regex\d{2}/ for case-INsensitive regex
|
||||
") }}</br>
|
||||
<span class="pure-form-message-inline">Text to wait for before triggering a change/notification, all text and regex are tested <i>case-insensitive</i>.</span><br/>
|
||||
<span class="pure-form-message-inline">Trigger text is processed from the result-text that comes out of any <a href="#filters">CSS/JSON Filters</a> for this watch</span>.<br/>
|
||||
<span class="pure-form-message-inline">Each line is process separately (think of each line as "OR")</span><br/>
|
||||
<span class="pure-form-message-inline">Note: Wrap in forward slash / to use regex example: <span style="font-family: monospace; background: #eee">/foo\d/</span> </span>
|
||||
") }}
|
||||
<span class="pure-form-message-inline">
|
||||
<ul>
|
||||
<li>Text to wait for before triggering a change/notification, all text and regex are tested <i>case-insensitive</i>.</li>
|
||||
<li>Trigger text is processed from the result-text that comes out of any CSS/JSON Filters for this watch</li>
|
||||
<li>Each line is processed separately (think of each line as "OR")</li>
|
||||
<li>Note: Wrap in forward slash / to use regex example: <span style="font-family: monospace; background: #eee">/foo\d/</span></li>
|
||||
</ul>
|
||||
</span>
|
||||
</div>
|
||||
</fieldset>
|
||||
</div>
|
||||
|
||||
<div id="actions">
|
||||
<div class="pure-control-group">
|
||||
|
||||
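To illustrate the Trigger/wait-for-text hints above, some example lines as they might be entered in that textarea (the phrases and regex are invented):

```python
# One entry per line in the "Trigger/wait for text" box; lines are OR'd together
trigger_text = [
    "Back in stock",          # plain text, matched case-insensitively
    "/tickets? available/",   # wrap in forward slashes to use a regex
    "/price: \\$\\d+/",
]
```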
|
||||
@@ -5,7 +5,14 @@
|
||||
<div class="inner">
|
||||
<form class="pure-form pure-form-aligned" action="{{url_for('import_page')}}" method="POST">
|
||||
<fieldset class="pure-group">
|
||||
<legend>One URL per line, URLs that do not pass validation will stay in the textarea.</legend>
|
||||
<legend>
|
||||
Enter one URL per line, and optionally add tags for each URL after a space, delimited by commas (,):
|
||||
<br>
|
||||
<code>https://example.com tag1, tag2, last tag</code>
|
||||
<br>
|
||||
URLs which do not pass validation will stay in the textarea.
|
||||
</legend>
|
||||
|
||||
|
||||
<textarea name="urls" class="pure-input-1-2" placeholder="https://"
|
||||
style="width: 100%;
|
||||
@@ -20,4 +27,3 @@
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
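As a rough sketch (not part of the diff) of the import format described in the legend above, one URL per line optionally followed by a space and comma-separated tags; the real import handler is not shown in this hunk, so treat this purely as an illustration of the format.

```python
# Illustrative parsing of one import line into (url, tags)
line = "https://example.com tag1, tag2, last tag"

url, _, tag_part = line.strip().partition(' ')
tags = [t.strip() for t in tag_part.split(',') if t.strip()]
print(url)   # https://example.com
print(tags)  # ['tag1', 'tag2', 'last tag']
```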
|
||||
|
||||
19
changedetectionio/templates/notification-log.html
Normal file
@@ -0,0 +1,19 @@
|
||||
{% extends 'base.html' %}
|
||||
|
||||
{% block content %}
|
||||
<div class="edit-form">
|
||||
<div class="inner">
|
||||
|
||||
<h4 style="margin-top: 0px;">The following issues were detected when sending notifications</h4>
|
||||
<div id="notification-customisation">
|
||||
<ul style="font-size: 80%; margin:0px; padding: 0 0 0 7px">
|
||||
{% for log in logs|reverse %}
|
||||
<li>{{log}}</li>
|
||||
{% endfor %}
|
||||
</ul>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
13
changedetectionio/templates/preview-image.html
Normal file
@@ -0,0 +1,13 @@
|
||||
{% extends 'base.html' %}
|
||||
|
||||
{% block content %}
|
||||
|
||||
<div id="settings">
|
||||
<h1>Current</h1>
|
||||
</div>
|
||||
|
||||
<div id="diff-ui">
|
||||
image goes here
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
@@ -6,21 +6,16 @@
|
||||
<h1>Current</h1>
|
||||
</div>
|
||||
|
||||
|
||||
<div id="diff-ui">
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr>
|
||||
<!-- just proof of concept copied straight from github.com/kpdecker/jsdiff -->
|
||||
|
||||
<td id="diff-col">
|
||||
<span id="result">{% for row in content %}<pre>{{row}}</pre>{% endfor %}</span>
|
||||
<span id="result">{{content}}</span>
|
||||
</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
|
||||
{% endblock %}
|
||||
@@ -13,6 +13,7 @@
|
||||
<li class="tab" id="default-tab"><a href="#general">General</a></li>
|
||||
<li class="tab"><a href="#notifications">Notifications</a></li>
|
||||
<li class="tab"><a href="#fetching">Fetching</a></li>
|
||||
<li class="tab"><a href="#filters">Global Filters</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="box-wrap inner">
|
||||
@@ -24,19 +25,23 @@
|
||||
<span class="pure-form-message-inline">Default time for all watches, when the watch does not have a specific time setting.</span>
|
||||
</div>
|
||||
<div class="pure-control-group">
|
||||
{% if current_user.is_authenticated %}
|
||||
<a href="{{url_for('settings_page', removepassword='yes')}}"
|
||||
class="pure-button pure-button-primary">Remove password</a>
|
||||
{% if not hide_remove_pass %}
|
||||
{% if current_user.is_authenticated %}
|
||||
<a href="{{url_for('settings_page', removepassword='yes')}}"
|
||||
class="pure-button pure-button-primary">Remove password</a>
|
||||
{% else %}
|
||||
{{ render_field(form.password) }}
|
||||
<span class="pure-form-message-inline">Password protection for your changedetection.io application.</span>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
{{ render_field(form.password) }}
|
||||
<span class="pure-form-message-inline">Password protection for your changedetection.io application.</span>
|
||||
<span class="pure-form-message-inline">Password is locked.</span>
|
||||
{% endif %}
|
||||
</div>
|
||||
<div class="pure-control-group">
|
||||
{{ render_field(form.base_url, placeholder="http://yoursite.com:5000/",
|
||||
class="m-d") }}
|
||||
<span class="pure-form-message-inline">
|
||||
Base URL used for the {base_url} token in notifications, default value is the ENV var 'BASE_URL' (Currently "{{current_base_url}}"),
|
||||
Base URL used for the {base_url} token in notifications and RSS links.<br/>Default value is the ENV var 'BASE_URL' (Currently "{{current_base_url}}"),
|
||||
<a href="https://github.com/dgtlmoon/changedetection.io/wiki/Configurable-BASE_URL-setting">read more here</a>.
|
||||
</span>
|
||||
</div>
|
||||
@@ -54,6 +59,8 @@
|
||||
{{ render_common_settings_form(form, current_base_url) }}
|
||||
</div>
|
||||
</fieldset>
|
||||
<a href="{{url_for('notification_logs')}}">Notification debug logs</a>
|
||||
|
||||
</div>
|
||||
|
||||
<div class="tab-pane-inner" id="fetching">
|
||||
@@ -65,6 +72,34 @@
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
|
||||
<div class="tab-pane-inner" id="filters">
|
||||
|
||||
<fieldset class="pure-group">
|
||||
{{ render_field(form.ignore_whitespace) }}
|
||||
<span class="pure-form-message-inline">Ignore whitespace, tabs and new-lines/line-feeds when considering if a change was detected.<br/>
|
||||
<i>Note:</i> Changing this will change the status of your existing watches and possibly trigger alerts, etc.
|
||||
</span>
|
||||
</fieldset>
|
||||
|
||||
|
||||
<fieldset class="pure-group">
|
||||
{{ render_field(form.global_ignore_text, rows=5, placeholder="Some text to ignore in a line
|
||||
/some.regex\d{2}/ for case-INsensitive regex
|
||||
") }}
|
||||
<span class="pure-form-message-inline">Note: This is applied globally in addition to the per-watch rules.</span><br/>
|
||||
<span class="pure-form-message-inline">
|
||||
<ul>
|
||||
<li>Note: This is applied globally in addition to the per-watch rules.</li>
|
||||
<li>Each line processed separately, any line matching will be ignored (removed before creating the checksum)</li>
|
||||
<li>Regular Expression support, wrap the line in forward slash <b>/regex/</b></li>
|
||||
<li>Changing this will affect the comparison checksum which may trigger an alert</li>
|
||||
</ul>
|
||||
</span>
|
||||
</fieldset>
|
||||
</div>
|
||||
|
||||
<div id="actions">
|
||||
<div class="pure-control-group">
|
||||
<button type="submit" class="pure-button pure-button-primary">Save</button>
|
||||
|
||||
@@ -42,6 +42,7 @@
|
||||
<tr id="{{ watch.uuid }}"
|
||||
class="{{ loop.cycle('pure-table-odd', 'pure-table-even') }}
|
||||
{% if watch.last_error is defined and watch.last_error != False %}error{% endif %}
|
||||
{% if watch.last_notification_error is defined and watch.last_notification_error != False %}error{% endif %}
|
||||
{% if watch.paused is defined and watch.paused != False %}paused{% endif %}
|
||||
{% if watch.newest_history_key| int > watch.last_viewed| int %}unviewed{% endif %}">
|
||||
<td class="inline">{{ loop.index }}</td>
|
||||
@@ -49,11 +50,14 @@
|
||||
|
||||
<td class="title-col inline">{{watch.title if watch.title is not none and watch.title|length > 0 else watch.url}}
|
||||
<a class="external" target="_blank" rel="noopener" href="{{ watch.url }}"></a>
|
||||
{%if watch.fetch_backend == "html_webdriver" %}<img style="height: 1em; display:inline-block;" src="static/images/Google-Chrome-icon.png" />{% endif %}
|
||||
{%if watch.fetch_backend == "html_webdriver" %}<img style="height: 1em; display:inline-block;" src="{{url_for('static_content', group='images', filename='Google-Chrome-icon.png')}}" />{% endif %}
|
||||
|
||||
{% if watch.last_error is defined and watch.last_error != False %}
|
||||
<div class="fetch-error">{{ watch.last_error }}</div>
|
||||
{% endif %}
|
||||
{% if watch.last_notification_error is defined and watch.last_notification_error != False %}
|
||||
<div class="fetch-error notification-error">{{ watch.last_notification_error }}</div>
|
||||
{% endif %}
|
||||
{% if not active_tag %}
|
||||
<span class="watch-tag-list">{{ watch.tag}}</span>
|
||||
{% endif %}
|
||||
|
||||
@@ -18,7 +18,8 @@ def cleanup(datastore_path):
|
||||
'url-watches.json',
|
||||
'notification.txt',
|
||||
'count.txt',
|
||||
'endpoint-content.txt']
|
||||
'endpoint-content.txt'
|
||||
]
|
||||
for file in files:
|
||||
try:
|
||||
os.unlink("{}/{}".format(datastore_path, file))
|
||||
|
||||
74
changedetectionio/tests/test_api.py
Normal file
@@ -0,0 +1,74 @@
|
||||
#!/usr/bin/python3
|
||||
|
||||
import time
|
||||
from flask import url_for
|
||||
from . util import live_server_setup
|
||||
|
||||
def test_setup(live_server):
|
||||
live_server_setup(live_server)
|
||||
|
||||
|
||||
def set_response_data(test_return_data):
|
||||
with open("test-datastore/endpoint-content.txt", "w") as f:
|
||||
f.write(test_return_data)
|
||||
|
||||
|
||||
def test_snapshot_api_detects_change(client, live_server):
|
||||
|
||||
test_return_data = "Some initial text"
|
||||
|
||||
test_return_data_modified = "Some NEW nice initial text"
|
||||
|
||||
sleep_time_for_fetch_thread = 3
|
||||
|
||||
set_response_data(test_return_data)
|
||||
|
||||
# Give the endpoint time to spin up
|
||||
time.sleep(1)
|
||||
|
||||
# Add our URL to the import page
|
||||
test_url = url_for('test_endpoint', _external=True)
|
||||
res = client.post(
|
||||
url_for("import_page"),
|
||||
data={"urls": test_url},
|
||||
follow_redirects=True
|
||||
)
|
||||
assert b"1 Imported" in res.data
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
|
||||
# Give the thread time to pick it up
|
||||
time.sleep(sleep_time_for_fetch_thread)
|
||||
|
||||
res = client.get(
|
||||
url_for("api_snapshot", uuid="first"),
|
||||
follow_redirects=True
|
||||
)
|
||||
|
||||
assert test_return_data.encode() == res.data
|
||||
|
||||
# Make a change
|
||||
set_response_data(test_return_data_modified)
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
# Give the thread time to pick it up
|
||||
time.sleep(sleep_time_for_fetch_thread)
|
||||
|
||||
res = client.get(
|
||||
url_for("api_snapshot", uuid="first"),
|
||||
follow_redirects=True
|
||||
)
|
||||
|
||||
assert test_return_data_modified.encode() == res.data
|
||||
|
||||
def test_snapshot_api_invalid_uuid(client, live_server):
|
||||
|
||||
res = client.get(
|
||||
url_for("api_snapshot", uuid="invalid"),
|
||||
follow_redirects=True
|
||||
)
|
||||
|
||||
assert res.status_code == 400
|
||||
|
||||
39
changedetectionio/tests/test_auth.py
Normal file
@@ -0,0 +1,39 @@
|
||||
#!/usr/bin/python3
|
||||
|
||||
import time
|
||||
from flask import url_for
|
||||
from . util import live_server_setup
|
||||
|
||||
def test_basic_auth(client, live_server):
|
||||
|
||||
live_server_setup(live_server)
|
||||
# Give the endpoint time to spin up
|
||||
time.sleep(1)
|
||||
|
||||
# Add our URL to the import page
|
||||
test_url = url_for('test_basicauth_method', _external=True).replace("//","//myuser:mypass@")
|
||||
|
||||
res = client.post(
|
||||
url_for("import_page"),
|
||||
data={"urls": test_url},
|
||||
follow_redirects=True
|
||||
)
|
||||
assert b"1 Imported" in res.data
|
||||
|
||||
# Check form validation
|
||||
res = client.post(
|
||||
url_for("edit_page", uuid="first"),
|
||||
data={"css_filter": "", "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
|
||||
follow_redirects=True
|
||||
)
|
||||
assert b"Updated watch." in res.data
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
time.sleep(1)
|
||||
res = client.get(
|
||||
url_for("preview_page", uuid="first"),
|
||||
follow_redirects=True
|
||||
)
|
||||
|
||||
assert b'myuser mypass basic' in res.data
|
||||
@@ -50,7 +50,7 @@ def test_check_basic_change_detection_functionality(client, live_server):
|
||||
|
||||
# Force recheck
|
||||
res = client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
assert b'1 watches are rechecking.' in res.data
|
||||
assert b'1 watches are queued for rechecking.' in res.data
|
||||
|
||||
time.sleep(sleep_time_for_fetch_thread)
|
||||
|
||||
@@ -100,6 +100,14 @@ def test_check_basic_change_detection_functionality(client, live_server):
|
||||
# It should have picked up the <title>
|
||||
assert b'head title' in res.data
|
||||
|
||||
|
||||
# be sure the HTML converter worked
|
||||
res = client.get(url_for("preview_page", uuid="first"))
|
||||
assert b'<html>' not in res.data
|
||||
|
||||
res = client.get(url_for("preview_page", uuid="first"))
|
||||
assert b'Some initial text' in res.data
|
||||
|
||||
#
|
||||
# Cleanup everything
|
||||
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
|
||||
|
||||
25
changedetectionio/tests/test_backup.py
Normal file
@@ -0,0 +1,25 @@
|
||||
#!/usr/bin/python3
|
||||
|
||||
import time
|
||||
from flask import url_for
|
||||
from urllib.request import urlopen
|
||||
from . util import set_original_response, set_modified_response, live_server_setup
|
||||
|
||||
|
||||
def test_backup(client, live_server):
|
||||
|
||||
live_server_setup(live_server)
|
||||
|
||||
# Give the endpoint time to spin up
|
||||
time.sleep(1)
|
||||
|
||||
res = client.get(
|
||||
url_for("get_backup"),
|
||||
follow_redirects=True
|
||||
)
|
||||
|
||||
# Should get the right zip content type
|
||||
assert res.content_type == "application/zip"
|
||||
# Should be PK/ZIP stream
|
||||
assert res.data.count(b'PK') >= 2
|
||||
|
||||
56
changedetectionio/tests/test_binary_fetch.py
Normal file
@@ -0,0 +1,56 @@
|
||||
#!/usr/bin/python3
|
||||
|
||||
import time
|
||||
import secrets
|
||||
from flask import url_for
|
||||
from . util import live_server_setup
|
||||
|
||||
|
||||
def test_binary_file_change(client, live_server):
|
||||
with open("test-datastore/test.bin", "wb") as f:
|
||||
f.write(secrets.token_bytes())
|
||||
|
||||
live_server_setup(live_server)
|
||||
|
||||
sleep_time_for_fetch_thread = 3
|
||||
|
||||
# Give the endpoint time to spin up
|
||||
time.sleep(1)
|
||||
|
||||
# Add our URL to the import page
|
||||
test_url = url_for('test_binaryfile_endpoint', _external=True)
|
||||
res = client.post(
|
||||
url_for("import_page"),
|
||||
data={"urls": test_url},
|
||||
follow_redirects=True
|
||||
)
|
||||
assert b"1 Imported" in res.data
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
|
||||
# Give the thread time to pick it up
|
||||
time.sleep(sleep_time_for_fetch_thread)
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
|
||||
# It should report nothing found (no new 'unviewed' class)
|
||||
res = client.get(url_for("index"))
|
||||
assert b'unviewed' not in res.data
|
||||
assert b'/test-binary-endpoint' in res.data
|
||||
|
||||
# Make a change
|
||||
with open("test-datastore/test.bin", "wb") as f:
|
||||
f.write(secrets.token_bytes())
|
||||
|
||||
|
||||
# Trigger a check
|
||||
client.get(url_for("api_watch_checknow"), follow_redirects=True)
|
||||
|
||||
# Give the thread time to pick it up
|
||||
time.sleep(sleep_time_for_fetch_thread)
|
||||
|
||||
# It should report nothing found (no new 'unviewed' class)
|
||||
res = client.get(url_for("index"))
|
||||
assert b'unviewed' in res.data
|
||||
62  changedetectionio/tests/test_errorhandling.py  Normal file
@@ -0,0 +1,62 @@
#!/usr/bin/python3

import time
from flask import url_for
from . util import live_server_setup

from ..html_tools import *

def test_setup(live_server):
    live_server_setup(live_server)


def test_error_handler(client, live_server):

    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    test_url = url_for('test_endpoint_403_error', _external=True)
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(3)

    res = client.get(url_for("index"))
    assert b'unviewed' not in res.data
    assert b'Status Code 403' in res.data
    assert bytes("just now".encode('utf-8')) in res.data

# Just to be sure error text is properly handled
def test_error_text_handler(client, live_server):
    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    res = client.post(
        url_for("import_page"),
        data={"urls": "https://errorfuldomainthatnevereallyexists12356.com"},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(3)

    res = client.get(url_for("index"))
    assert b'Name or service not known' in res.data
    assert bytes("just now".encode('utf-8')) in res.data
@@ -1,80 +0,0 @@
import json
import time
from flask import url_for
from . util import set_original_response, set_modified_response, live_server_setup

# Hard to just add more live server URLs when one test is already running (I think)
# So we add our test here (was in a different file)
def test_headers_in_request(client, live_server):
    live_server_setup(live_server)

    # Add our URL to the import page
    test_url = url_for('test_headers', _external=True)

    # Add the test URL twice, we will check
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    cookie_header = '_ga=GA1.2.1022228332; cookie-preferences=analytics:accepted;'

    # Add some headers to a request
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "fetch_backend": "html_requests",
            "headers": "xxx:ooo\ncool:yeah\r\ncookie:"+cookie_header},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Give the thread time to pick up the first version
    time.sleep(5)

    # The service should echo back the request headers
    res = client.get(
        url_for("preview_page", uuid="first"),
        follow_redirects=True
    )

    # Flask will convert the header key to uppercase
    assert b"Xxx:ooo" in res.data
    assert b"Cool:yeah" in res.data

    # The test call service will return the headers as the body
    from html import escape
    assert escape(cookie_header).encode('utf-8') in res.data

    time.sleep(5)

    # Re #137 - Examine the JSON index file, it should have only one set of headers entered
    watches_with_headers = 0
    with open('test-datastore/url-watches.json') as f:
        app_struct = json.load(f)
        for uuid in app_struct['watching']:
            if (len(app_struct['watching'][uuid]['headers'])):
                watches_with_headers += 1

    # Should be only one with headers set
    assert watches_with_headers==1
@@ -151,3 +151,88 @@ def test_check_ignore_text_functionality(client, live_server):

    res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
    assert b'Deleted' in res.data

def test_check_global_ignore_text_functionality(client, live_server):
    sleep_time_for_fetch_thread = 3

    ignore_text = "XXXXX\r\nYYYYY\r\nZZZZZ"
    set_original_ignore_response()

    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    test_url = url_for('test_endpoint', _external=True)
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # Goto the settings page, add our ignore text
    res = client.post(
        url_for("settings_page"),
        data={
            "minutes_between_check": 180,
            "global_ignore_text": ignore_text,
            'fetch_backend': "html_requests"
        },
        follow_redirects=True
    )
    assert b"Settings updated." in res.data

    # Goto the edit page of the item, add our ignore text
    # Add our URL to the import page
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={"ignore_text": "something irrelevent but just to check", "url": test_url, 'fetch_backend': "html_requests"},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Check it saved
    res = client.get(
        url_for("settings_page"),
    )
    assert bytes(ignore_text.encode('utf-8')) in res.data

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # It should report nothing found (no new 'unviewed' class)
    res = client.get(url_for("index"))
    assert b'unviewed' not in res.data
    assert b'/test-endpoint' in res.data

    # Make a change
    set_modified_ignore_response()

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)
    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # It should report nothing found (no new 'unviewed' class)
    res = client.get(url_for("index"))
    assert b'unviewed' not in res.data
    assert b'/test-endpoint' in res.data

    # Just to be sure.. set a regular modified change..
    set_modified_original_ignore_response()
    client.get(url_for("api_watch_checknow"), follow_redirects=True)
    time.sleep(sleep_time_for_fetch_thread)
    res = client.get(url_for("index"))
    assert b'unviewed' in res.data

    res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
    assert b'Deleted' in res.data
96  changedetectionio/tests/test_ignorewhitespace.py  Normal file
@@ -0,0 +1,96 @@
#!/usr/bin/python3

import time
from flask import url_for
from . util import live_server_setup

def test_setup(live_server):
    live_server_setup(live_server)


# Should be the same as set_original_ignore_response() but with a little more whitespacing
def set_original_ignore_response_but_with_whitespace():
    test_return_data = """<html>
       <body>
     Some initial text</br>
     <p>

     Which is across multiple lines</p>
     <br>
     </br>

     So let's see what happens.  </br>


     </body>
     </html>

    """
    with open("test-datastore/endpoint-content.txt", "w") as f:
        f.write(test_return_data)


def set_original_ignore_response():
    test_return_data = """<html>
       <body>
     Some initial text</br>
     <p>Which is across multiple lines</p>
     </br>
     So let's see what happens.  </br>
     </body>
     </html>

    """

    with open("test-datastore/endpoint-content.txt", "w") as f:
        f.write(test_return_data)


# If there was only a change in the whitespacing, then we shouldn't have a change detected
def test_check_ignore_whitespace(client, live_server):
    sleep_time_for_fetch_thread = 3

    # Give the endpoint time to spin up
    time.sleep(1)

    set_original_ignore_response()

    # Goto the settings page, add our ignore text
    res = client.post(
        url_for("settings_page"),
        data={
            "minutes_between_check": 180,
            "ignore_whitespace": "y",
            'fetch_backend': "html_requests"
        },
        follow_redirects=True
    )
    assert b"Settings updated." in res.data

    # Add our URL to the import page
    test_url = url_for('test_endpoint', _external=True)
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    time.sleep(sleep_time_for_fetch_thread)
    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    set_original_ignore_response_but_with_whitespace()
    time.sleep(sleep_time_for_fetch_thread)
    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # It should report nothing found (no new 'unviewed' class)
    res = client.get(url_for("index"))
    assert b'unviewed' not in res.data
    assert b'/test-endpoint' in res.data
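The test above expects a whitespace-only rewrite of the page to produce no new `unviewed` state. Conceptually, `ignore_whitespace` amounts to comparing whitespace-normalised snapshots; the snippet below is a hedged, standalone illustration of that idea, not the application's actual normalisation code.

```
# Conceptual sketch (not the app's implementation): two snapshots compare
# equal once insignificant whitespace is stripped from every line.
def normalised(text):
    return "\n".join(line.strip() for line in text.splitlines() if line.strip())

before = "Some initial text\n\n   Which is across multiple lines\n"
after  = "   Some initial text\nWhich is across multiple lines\n\n\n"

# A whitespace-only change leaves the normalised form identical.
assert normalised(before) == normalised(after)
```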
28  changedetectionio/tests/test_import.py  Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/python3

import time

from flask import url_for

from .util import live_server_setup


def test_import(client, live_server):

    live_server_setup(live_server)

    # Give the endpoint time to spin up
    time.sleep(1)

    res = client.post(
        url_for("import_page"),
        data={
            "urls": """https://example.com
https://example.com tag1
https://example.com tag1, other tag"""
        },
        follow_redirects=True,
    )
    assert b"3 Imported" in res.data
    assert b"tag1" in res.data
    assert b"other tag" in res.data
@@ -111,6 +111,21 @@ def set_original_response():
        f.write(test_return_data)
    return None


def set_response_with_html():
    test_return_data = """
    {
      "test": [
        {
          "html": "<b>"
        }
      ]
    }
    """
    with open("test-datastore/endpoint-content.txt", "w") as f:
        f.write(test_return_data)
    return None

def set_modified_response():
    test_return_data = """
    {
@@ -138,6 +153,37 @@ def set_modified_response():

    return None

def test_check_json_without_filter(client, live_server):
    # Request a JSON document from an application/json source containing HTML
    # and be sure it doesn't get chewed up by inscriptis
    set_response_with_html()

    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    test_url = url_for('test_endpoint_json', _external=True)
    client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(3)

    res = client.get(
        url_for("preview_page", uuid="first"),
        follow_redirects=True
    )

    assert b'"<b>' in res.data
    assert res.data.count(b'{\n') >= 2


def test_check_json_filter(client, live_server):
    json_filter = 'json:boss.name'
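The `json:boss.name` filter above maps onto a JSONPath expression, and the requirements change further down pins `jsonpath-ng` for exactly this. The snippet below is a standalone illustration of that extraction using jsonpath-ng directly; the sample document is invented for the example and is not part of the diff.

```
# Standalone illustration of a 'json:boss.name' style filter using
# jsonpath-ng (the library pinned in requirements.txt).
import json
from jsonpath_ng import parse

doc = json.loads('{"boss": {"name": "Fred"}, "staff": ["Sam", "Alex"]}')

# 'boss.name' is the JSONPath part after the 'json:' prefix
matches = parse('boss.name').find(doc)
assert [m.value for m in matches] == ['Fred']
```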
@@ -4,6 +4,7 @@ import re
from flask import url_for
from . util import set_original_response, set_modified_response, live_server_setup
import logging
from changedetectionio.notification import default_notification_body, default_notification_title

# Hard to just add more live server URLs when one test is already running (I think)
# So we add our test here (was in a different file)
@@ -15,6 +16,11 @@ def test_check_notification(client, live_server):
    # Give the endpoint time to spin up
    time.sleep(3)

    # Re 360 - new install should have defaults set
    res = client.get(url_for("settings_page"))
    assert default_notification_body.encode() in res.data
    assert default_notification_title.encode() in res.data

    # When test mode is in BASE_URL env mode, we should see this already configured
    env_base_url = os.getenv('BASE_URL', '').strip()
    if len(env_base_url):
@@ -117,7 +123,8 @@ def test_check_notification(client, live_server):
    assert test_url in notification_submission

    # Diff was correctly executed
    assert "Diff Full: (changed) Which is across multiple lines" in notification_submission
    assert "Diff Full: Some initial text" in notification_submission
    assert "Diff: (changed) Which is across multiple lines" in notification_submission
    assert "(-> into) which has this one new line" in notification_submission

@@ -159,6 +166,9 @@ def test_check_notification(client, live_server):

    with open("test-datastore/notification.txt", "r") as f:
        notification_submission = f.read()
    print ("Notification submission was:", notification_submission)
    # Re #342 - check for accidental python byte encoding of non-utf8/string
    assert "b'" not in notification_submission

    assert re.search('Watch UUID: [0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}', notification_submission, re.IGNORECASE)
    assert "Watch title: my title" in notification_submission
@@ -198,3 +208,20 @@ def test_check_notification(client, live_server):
    )

    assert bytes("is not a valid token".encode('utf-8')) in res.data

    # Re #360 some validation
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={"notification_urls": notification_url,
              "notification_title": "",
              "notification_body": "",
              "notification_format": "Text",
              "url": test_url,
              "tag": "my tag",
              "title": "my title",
              "headers": "",
              "fetch_backend": "html_requests",
              "trigger_check": "y"},
        follow_redirects=True
    )
    assert b"Notification Body and Title is required when a Notification URL is used" in res.data
66  changedetectionio/tests/test_notification_errors.py  Normal file
@@ -0,0 +1,66 @@
import os
import time
import re
from flask import url_for
from . util import set_original_response, set_modified_response, live_server_setup
import logging

def test_check_notification_error_handling(client, live_server):

    live_server_setup(live_server)
    set_original_response()

    # Give the endpoint time to spin up
    time.sleep(3)

    # use a different URL so that it doesn't interfere with the actual check until we are ready
    test_url = url_for('test_endpoint', _external=True)
    res = client.post(
        url_for("api_watch_add"),
        data={"url": "https://changedetection.io/CHANGELOG.txt", "tag": ''},
        follow_redirects=True
    )
    assert b"Watch added" in res.data

    time.sleep(10)

    # Check we capture the failure, we can just use trigger_check = y here
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={"notification_urls": "jsons://broken-url.changedetection.io/test",
              "notification_title": "xxx",
              "notification_body": "xxxxx",
              "notification_format": "Text",
              "url": test_url,
              "tag": "",
              "title": "",
              "headers": "",
              "minutes_between_check": "180",
              "fetch_backend": "html_requests",
              "trigger_check": "y"},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    found = False
    for i in range(1, 10):
        time.sleep(1)
        logging.debug("Fetching watch overview....")
        res = client.get(
            url_for("index"))

        if bytes("Notification error detected".encode('utf-8')) in res.data:
            found = True
            break

    assert found

    # The error should show in the notification logs
    res = client.get(
        url_for("notification_logs"))
    assert bytes("Name or service not known".encode('utf-8')) in res.data

    # And it should be listed on the watch overview
211  changedetectionio/tests/test_request.py  Normal file
@@ -0,0 +1,211 @@
import json
import time
from flask import url_for
from . util import set_original_response, set_modified_response, live_server_setup

def test_setup(live_server):
    live_server_setup(live_server)

# Hard to just add more live server URLs when one test is already running (I think)
# So we add our test here (was in a different file)
def test_headers_in_request(client, live_server):
    # Add our URL to the import page
    test_url = url_for('test_headers', _external=True)

    # Add the test URL twice, we will check
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    cookie_header = '_ga=GA1.2.1022228332; cookie-preferences=analytics:accepted;'

    # Add some headers to a request
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "fetch_backend": "html_requests",
            "headers": "xxx:ooo\ncool:yeah\r\ncookie:"+cookie_header},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Give the thread time to pick up the first version
    time.sleep(5)

    # The service should echo back the request headers
    res = client.get(
        url_for("preview_page", uuid="first"),
        follow_redirects=True
    )

    # Flask will convert the header key to uppercase
    assert b"Xxx:ooo" in res.data
    assert b"Cool:yeah" in res.data

    # The test call service will return the headers as the body
    from html import escape
    assert escape(cookie_header).encode('utf-8') in res.data

    time.sleep(5)

    # Re #137 - Examine the JSON index file, it should have only one set of headers entered
    watches_with_headers = 0
    with open('test-datastore/url-watches.json') as f:
        app_struct = json.load(f)
        for uuid in app_struct['watching']:
            if (len(app_struct['watching'][uuid]['headers'])):
                watches_with_headers += 1

    # Should be only one with headers set
    assert watches_with_headers==1

def test_body_in_request(client, live_server):
    # Add our URL to the import page
    test_url = url_for('test_body', _external=True)

    # Add the test URL twice, we will check
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    body_value = 'Test Body Value'

    # Attempt to add a body with a GET method
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "method": "GET",
            "fetch_backend": "html_requests",
            "body": "invalid"},
        follow_redirects=True
    )
    assert b"Body must be empty when Request Method is set to GET" in res.data

    # Add a properly formatted body with a proper method
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "method": "POST",
            "fetch_backend": "html_requests",
            "body": body_value},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Give the thread time to pick up the first version
    time.sleep(5)

    # The service should echo back the body
    res = client.get(
        url_for("preview_page", uuid="first"),
        follow_redirects=True
    )

    # Check if body returned contains the specified data
    assert str.encode(body_value) in res.data

    watches_with_body = 0
    with open('test-datastore/url-watches.json') as f:
        app_struct = json.load(f)
        for uuid in app_struct['watching']:
            if app_struct['watching'][uuid]['body']==body_value:
                watches_with_body += 1

    # Should be only one with body set
    assert watches_with_body==1

def test_method_in_request(client, live_server):
    # Add our URL to the import page
    test_url = url_for('test_method', _external=True)

    # Add the test URL twice, we will check
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    # Attempt to add a method which is not valid
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "fetch_backend": "html_requests",
            "method": "invalid"},
        follow_redirects=True
    )
    assert b"Not a valid choice" in res.data

    # Add a properly formatted body
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={
            "url": test_url,
            "tag": "",
            "fetch_backend": "html_requests",
            "method": "PATCH"},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Give the thread time to pick up the first version
    time.sleep(5)

    # The service should echo back the request verb
    res = client.get(
        url_for("preview_page", uuid="first"),
        follow_redirects=True
    )

    # The test call service will return the verb as the body
    assert b"PATCH" in res.data

    time.sleep(5)

    watches_with_method = 0
    with open('test-datastore/url-watches.json') as f:
        app_struct = json.load(f)
        for uuid in app_struct['watching']:
            if app_struct['watching'][uuid]['method'] == 'PATCH':
                watches_with_method += 1

    # Should be only one with method set to PATCH
    assert watches_with_method == 1
118  changedetectionio/tests/test_xpath_selector.py  Normal file
@@ -0,0 +1,118 @@
#!/usr/bin/python3

import time
from flask import url_for
from . util import live_server_setup

from ..html_tools import *

def test_setup(live_server):
    live_server_setup(live_server)

def set_original_response():
    test_return_data = """<html>
       <body>
     Some initial text</br>
     <p>Which is across multiple lines</p>
     </br>
     So let's see what happens.  </br>
     <div class="sametext">Some text thats the same</div>
     <div class="changetext">Some text that will change</div>
     </body>
     </html>
    """

    with open("test-datastore/endpoint-content.txt", "w") as f:
        f.write(test_return_data)
    return None

def set_modified_response():
    test_return_data = """<html>
       <body>
     Some initial text</br>
     <p>Which is across multiple lines</p>
     </br>
     So let's see what happens. THIS CHANGES AND SHOULDNT TRIGGER A CHANGE</br>
     <div class="sametext">Some text thats the same</div>
     <div class="changetext">Some new text</div>
     </body>
     </html>
    """

    with open("test-datastore/endpoint-content.txt", "w") as f:
        f.write(test_return_data)

    return None


def test_check_markup_xpath_filter_restriction(client, live_server):
    sleep_time_for_fetch_thread = 3

    xpath_filter = "//*[contains(@class, 'sametext')]"

    set_original_response()

    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    test_url = url_for('test_endpoint', _external=True)
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)

    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # Goto the edit page, add our ignore text
    # Add our URL to the import page
    res = client.post(
        url_for("edit_page", uuid="first"),
        data={"css_filter": xpath_filter, "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
        follow_redirects=True
    )
    assert b"Updated watch." in res.data

    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    # view it/reset state back to viewed
    client.get(url_for("diff_history_page", uuid="first"), follow_redirects=True)

    # Make a change
    set_modified_response()

    # Trigger a check
    client.get(url_for("api_watch_checknow"), follow_redirects=True)
    # Give the thread time to pick it up
    time.sleep(sleep_time_for_fetch_thread)

    res = client.get(url_for("index"))
    assert b'unviewed' not in res.data

def test_xpath_validation(client, live_server):

    # Give the endpoint time to spin up
    time.sleep(1)

    # Add our URL to the import page
    test_url = url_for('test_endpoint', _external=True)
    res = client.post(
        url_for("import_page"),
        data={"urls": test_url},
        follow_redirects=True
    )
    assert b"1 Imported" in res.data

    res = client.post(
        url_for("edit_page", uuid="first"),
        data={"css_filter": "/something horrible", "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
        follow_redirects=True
    )
    assert b"is not a valid XPath expression" in res.data
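`test_xpath_validation` expects `/something horrible` to be rejected with "is not a valid XPath expression". With lxml (added to requirements further down for XPath filtering), compiling the expression up front is enough to surface the syntax error; the sketch below illustrates that idea and is not the project's actual form validator.

```
# Rough sketch of validating an XPath expression up front with lxml;
# this is an illustration, not changedetection.io's validator code.
from lxml import etree

def is_valid_xpath(expression):
    try:
        etree.XPath(expression)   # compiling raises on bad syntax
        return True
    except etree.XPathSyntaxError:
        return False

assert is_valid_xpath("//*[contains(@class, 'sametext')]")
assert not is_valid_xpath("/something horrible")
```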
@@ -37,6 +37,16 @@ def set_modified_response():

def live_server_setup(live_server):

    @live_server.app.route('/test-binary-endpoint')
    def test_binaryfile_endpoint():

        from flask import make_response

        # Tried using a global var here but didn't seem to work, so reading from a file instead.
        with open("test-datastore/test.bin", "rb") as f:
            resp = make_response(f.read())
            resp.headers['Content-Type'] = 'image/jpeg'
            return resp

    @live_server.app.route('/test-endpoint')
    def test_endpoint():
@@ -44,6 +54,23 @@ def live_server_setup(live_server):
        with open("test-datastore/endpoint-content.txt", "r") as f:
            return f.read()

    @live_server.app.route('/test-endpoint-json')
    def test_endpoint_json():

        from flask import make_response

        with open("test-datastore/endpoint-content.txt", "r") as f:
            resp = make_response(f.read())
            resp.headers['Content-Type'] = 'application/json'
            return resp

    @live_server.app.route('/test-403')
    def test_endpoint_403_error():

        from flask import make_response
        resp = make_response('', 403)
        return resp

    # Just return the headers in the request
    @live_server.app.route('/test-headers')
    def test_headers():
@@ -56,6 +83,21 @@ def live_server_setup(live_server):

        return "\n".join(output)

    # Just return the body in the request
    @live_server.app.route('/test-body', methods=['POST', 'GET'])
    def test_body():

        from flask import request

        return request.data

    # Just return the verb in the request
    @live_server.app.route('/test-method', methods=['POST', 'GET', 'PATCH'])
    def test_method():

        from flask import request

        return request.method

    # Where we POST to as a notification
    @live_server.app.route('/test_notification_endpoint', methods=['POST', 'GET'])
@@ -71,4 +113,14 @@ def live_server_setup(live_server):
        print("\n>> Test notification endpoint was hit.\n")
        return "Text was set"

    # Return the basic auth details from the request
    @live_server.app.route('/test-basicauth', methods=['GET'])
    def test_basicauth_method():

        from flask import request
        auth = request.authorization
        ret = " ".join([auth.username, auth.password, auth.type])
        return ret

    live_server.start()
@@ -2,7 +2,12 @@ import threading
import queue
import time

# Requests for checking on the site use a pool of thread Workers managed by a Queue.
# A single update worker
#
# Requests for checking on a single site(watch) from a queue of watches
# (another process inserts watches into the queue that are time-ready for checking)


class update_worker(threading.Thread):
    current_uuid = None
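The comment block above describes the architecture this file implements: a pool of worker threads consumes watch UUIDs from a queue that a separate scheduler fills whenever a watch is due for checking. A minimal sketch of that shape follows; the names (`Worker`, `check_site`, `some-watch-uuid`) are illustrative stand-ins, not the real `update_worker` API.

```
# Minimal sketch of the queue/worker pattern described above; names and
# behaviour are illustrative only, not the real update_worker class.
import queue
import threading

def check_site(uuid):
    print("checking watch", uuid)        # stand-in for update_handler.run(uuid)

q = queue.Queue()

class Worker(threading.Thread):
    def __init__(self, q_ref):
        super().__init__(daemon=True)
        self.q_ref = q_ref

    def run(self):
        while True:
            uuid = self.q_ref.get()       # blocks until a watch UUID is queued
            try:
                check_site(uuid)
            finally:
                self.q_ref.task_done()    # mirrors self.q.task_done() in the diff

for _ in range(3):                        # a small pool of workers
    Worker(q).start()

q.put("some-watch-uuid")                  # scheduler side: enqueue watches that are due
q.join()                                  # wait until queued work has been processed
```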
@@ -34,92 +39,107 @@ class update_worker(threading.Thread):
changed_detected = False
contents = ""
update_obj= {}
now = time.time()

try:
    now = time.time()
    changed_detected, update_obj, contents = update_handler.run(uuid)

    # Always record that we atleast tried
    self.datastore.update_watch(uuid=uuid, update_obj={'fetch_time': round(time.time() - now, 3)})
    # Re #342
    # In Python 3, all strings are sequences of Unicode characters. There is a bytes type that holds raw bytes.
    # We then convert/.decode('utf-8') for the notification etc
    if not isinstance(contents, (bytes, bytearray)):
        raise Exception("Error - returned data from the fetch handler SHOULD be bytes")

except PermissionError as e:
    self.app.logger.error("File permission error updating", uuid, str(e))
except content_fetcher.EmptyReply as e:
    self.datastore.update_watch(uuid=uuid, update_obj={'last_error':str(e)})

    # Some kind of custom to-str handler in the exception handler that does this?
    err_text = "EmptyReply: Status Code {}".format(e.status_code)
    self.datastore.update_watch(uuid=uuid, update_obj={'last_error': err_text,
                                                       'last_check_status': e.status_code})
except Exception as e:
    self.app.logger.error("Exception reached processing watch UUID:%s - %s", uuid, str(e))
    self.app.logger.error("Exception reached processing watch UUID: %s - %s", uuid, str(e))
    self.datastore.update_watch(uuid=uuid, update_obj={'last_error': str(e)})

else:
    if update_obj:
        try:
            self.datastore.update_watch(uuid=uuid, update_obj=update_obj)
            if changed_detected:
                n_object = {}
                # A change was detected
                fname = self.datastore.save_history_text(watch_uuid=uuid, contents=contents)
    try:
        watch = self.datastore.data['watching'][uuid]
        fname = "" # Saved history text filename

                # Update history with the stripped text for future reference, this will also mean we save the first
                # Should always be keyed by string(timestamp)
                self.datastore.update_watch(uuid, {"history": {str(update_obj["last_checked"]): fname}})
        # For the FIRST time we check a site, or a change detected, save the snapshot.
        if changed_detected or not watch['last_checked']:
            # A change was detected
            fname = self.datastore.save_history_text(watch_uuid=uuid, contents=contents)
            # Should always be keyed by string(timestamp)
            self.datastore.update_watch(uuid, {"history": {str(round(time.time())): fname}})

                watch = self.datastore.data['watching'][uuid]
        # Generally update anything interesting returned
        self.datastore.update_watch(uuid=uuid, update_obj=update_obj)

                print (">> Change detected in UUID {} - {}".format(uuid, watch['url']))
        # A change was detected
        if changed_detected:
            n_object = {}
            print (">> Change detected in UUID {} - {}".format(uuid, watch['url']))

                # Notifications should only trigger on the second time (first time, we gather the initial snapshot)
                if len(watch['history']) > 1:
            # Notifications should only trigger on the second time (first time, we gather the initial snapshot)
            if len(watch['history']) > 1:

                dates = list(watch['history'].keys())
                # Convert to int, sort and back to str again
                # @todo replace datastore getter that does this automatically
                dates = [int(i) for i in dates]
                dates.sort(reverse=True)
                dates = [str(i) for i in dates]
                dates = list(watch['history'].keys())
                # Convert to int, sort and back to str again
                # @todo replace datastore getter that does this automatically
                dates = [int(i) for i in dates]
                dates.sort(reverse=True)
                dates = [str(i) for i in dates]

                prev_fname = watch['history'][dates[1]]
                prev_fname = watch['history'][dates[1]]

                # Did it have any notification alerts to hit?
                if len(watch['notification_urls']):
                    print(">>> Notifications queued for UUID from watch {}".format(uuid))
                    n_object['notification_urls'] = watch['notification_urls']
                    n_object['notification_title'] = watch['notification_title']
                    n_object['notification_body'] = watch['notification_body']
                    n_object['notification_format'] = watch['notification_format']
                # Did it have any notification alerts to hit?
                if len(watch['notification_urls']):
                    print(">>> Notifications queued for UUID from watch {}".format(uuid))
                    n_object['notification_urls'] = watch['notification_urls']
                    n_object['notification_title'] = watch['notification_title']
                    n_object['notification_body'] = watch['notification_body']
                    n_object['notification_format'] = watch['notification_format']

                # No? maybe theres a global setting, queue them all
                elif len(self.datastore.data['settings']['application']['notification_urls']):
                    print(">>> Watch notification URLs were empty, using GLOBAL notifications for UUID: {}".format(uuid))
                    n_object['notification_urls'] = self.datastore.data['settings']['application']['notification_urls']
                    n_object['notification_title'] = self.datastore.data['settings']['application']['notification_title']
                    n_object['notification_body'] = self.datastore.data['settings']['application']['notification_body']
                    n_object['notification_format'] = self.datastore.data['settings']['application']['notification_format']
                # No? maybe theres a global setting, queue them all
                elif len(self.datastore.data['settings']['application']['notification_urls']):
                    print(">>> Watch notification URLs were empty, using GLOBAL notifications for UUID: {}".format(uuid))
                    n_object['notification_urls'] = self.datastore.data['settings']['application']['notification_urls']
                    n_object['notification_title'] = self.datastore.data['settings']['application']['notification_title']
                    n_object['notification_body'] = self.datastore.data['settings']['application']['notification_body']
                    n_object['notification_format'] = self.datastore.data['settings']['application']['notification_format']
                else:
                    print(">>> NO notifications queued, watch and global notification URLs were empty.")

                # Only prepare to notify if the rules above matched
                if 'notification_urls' in n_object:
                    # HTML needs linebreak, but MarkDown and Text can use a linefeed
                    if n_object['notification_format'] == 'HTML':
                        line_feed_sep = "</br>"
                    else:
                        print(">>> NO notifications queued, watch and global notification URLs were empty.")
                        line_feed_sep = "\n"

                # Only prepare to notify if the rules above matched
                if 'notification_urls' in n_object:
                    # HTML needs linebreak, but MarkDown and Text can use a linefeed
                    if n_object['notification_format'] == 'HTML':
                        line_feed_sep = "</br>"
                    else:
                        line_feed_sep = "\n"

                    from changedetectionio import diff
                    n_object.update({
                        'watch_url': watch['url'],
                        'uuid': uuid,
                        'current_snapshot': contents.decode('utf-8'),
                        'diff': diff.render_diff(prev_fname, fname, line_feed_sep=line_feed_sep),
                        'diff_full': diff.render_diff(prev_fname, fname, True, line_feed_sep=line_feed_sep)
                    })

                    from changedetectionio import diff
                    n_object.update({
                        'watch_url': watch['url'],
                        'uuid': uuid,
                        'current_snapshot': str(contents),
                        'diff_full': diff.render_diff(prev_fname, fname, line_feed_sep=line_feed_sep),
                        'diff': diff.render_diff(prev_fname, fname, True, line_feed_sep=line_feed_sep)
                    })
                    self.notification_q.put(n_object)

                    self.notification_q.put(n_object)

        except Exception as e:
            print("!!!! Exception in update_worker !!!\n", e)
    except Exception as e:
        # Catch everything possible here, so that if a worker crashes, we don't lose it until restart!
        self.app.logger.error("Exception reached processing watch UUID: %s - %s", uuid, str(e))
        self.datastore.update_watch(uuid=uuid, update_obj={'last_error': str(e)})
    finally:
        # Always record that we atleast tried
        self.datastore.update_watch(uuid=uuid, update_obj={'fetch_time': round(time.time() - now, 3),
                                                           'last_checked': round(time.time())})

self.current_uuid = None  # Done
self.q.task_done()
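The hunk above makes the worker insist that the fetch handler returns bytes (Re #342) and switches the notification snapshot from `str(contents)` to `contents.decode('utf-8')`, so stray `b'...'` representations can no longer leak into notification text. A tiny standalone illustration of that contract (the sample value is invented):

```
# Illustration of the bytes-vs-str contract enforced above (Re #342):
# keep snapshots as bytes internally, decode only at the notification boundary.
contents = "Some new text".encode('utf-8')     # what a fetch handler should return

if not isinstance(contents, (bytes, bytearray)):
    raise Exception("Error - returned data from the fetch handler SHOULD be bytes")

# str(contents) would instead give "b'Some new text'" - the #342 bug.
notification_snapshot = contents.decode('utf-8')
assert "b'" not in notification_snapshot
```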
@@ -13,13 +13,23 @@ services:

    # - PUID=1000
    # - PGID=1000
    # # Alternative WebDriver/selenium URL, do not use "'s or 's!
    #
    # Alternative WebDriver/selenium URL, do not use "'s or 's!
    # - WEBDRIVER_URL=http://browser-chrome:4444/wd/hub
    # Proxy support example.
    #
    # WebDriver proxy settings webdriver_proxyType, webdriver_ftpProxy, webdriver_httpProxy, webdriver_noProxy,
    # webdriver_proxyAutoconfigUrl, webdriver_sslProxy, webdriver_autodetect,
    # webdriver_socksProxy, webdriver_socksUsername, webdriver_socksVersion, webdriver_socksPassword
    #
    # https://selenium-python.readthedocs.io/api.html#module-selenium.webdriver.common.proxy
    #
    # Plain requests - proxy support example.
    # - HTTP_PROXY=socks5h://10.10.1.10:1080
    # - HTTPS_PROXY=socks5h://10.10.1.10:1080
    #
    # An exclude list (useful for notification URLs above) can be specified with
    # - NO_PROXY="localhost,192.168.0.0/24"
    #
    # Base URL of your changedetection.io install (Added to the notification alert)
    # - BASE_URL=https://mysite.com

@@ -33,7 +43,8 @@ services:
    restart: unless-stopped

    # Used for fetching pages via WebDriver+Chrome where you need Javascript support.
    # Does not work on rPi, https://github.com/dgtlmoon/changedetection.io/wiki/Fetching-pages-with-WebDriver
    # Now working on arm64 (needs testing on rPi - tested on Oracle ARM instance)
    # replace image with seleniarm/standalone-chromium:4.0.0-20211213

    # browser-chrome:
    #     hostname: browser-chrome
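For the plain-requests fetcher, the compose comments above rely on the fact that the requests library already honours `HTTP_PROXY` / `HTTPS_PROXY` / `NO_PROXY` from the environment. A small sketch of that equivalence follows; the proxy address and URL are just the example values from the compose file, and SOCKS URLs additionally need the `requests[socks]` extra installed.

```
# Sketch only: requests picks up proxy settings from the environment,
# which is what the HTTP_PROXY/HTTPS_PROXY/NO_PROXY lines above configure.
import os
import requests

os.environ['HTTPS_PROXY'] = 'socks5h://10.10.1.10:1080'   # example value from docker-compose.yml
os.environ['NO_PROXY'] = 'localhost,192.168.0.0/24'
# (socks5h:// URLs require the 'requests[socks]' extra to be installed)

# Equivalent explicit form, without environment variables:
proxies = {'https': 'socks5h://10.10.1.10:1080'}
r = requests.get('https://changedetection.io/CHANGELOG.txt', proxies=proxies, timeout=30)
print(r.status_code)
```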
@@ -17,7 +17,7 @@ wtforms ~= 2.3.3
jsonpath-ng ~= 1.5.3

# Notification library
apprise ~= 0.9
apprise ~= 0.9.6

# apprise mqtt https://github.com/dgtlmoon/changedetection.io/issues/315
paho-mqtt
@@ -26,7 +26,13 @@ paho-mqtt
# ERROR: Could not build wheels for cryptography which use PEP 517 and cannot be installed directly
cryptography ~= 3.4

# Used for CSS filtering, replace with soupsieve and lxml for xpath
# Used for CSS filtering
bs4

selenium ~= 3.141
# XPath filtering, lxml is required by bs4 anyway, but put it here to be safe.
lxml

# 3.141 was missing socksVersion, 3.150 was not in pypi, so we try 4.1.0
selenium ~= 4.1.0
pytest ~=6.2
pytest-flask ~=1.2
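The requirements change spells out the split between the two filtering paths: BeautifulSoup/soupsieve for CSS selectors and lxml for XPath expressions (the ones starting with `/` or `//` in the tests above). The snippet below is an illustration of both paths against an invented HTML fragment, not the `html_tools` implementation itself.

```
# Quick illustration of the two filtering paths the requirements describe:
# CSS selectors via BeautifulSoup/soupsieve, XPath via lxml.
from bs4 import BeautifulSoup
from lxml import html as lxml_html

page = '<html><body><div class="sametext">keep me</div><div class="changetext">skip</div></body></html>'

# CSS selector path
soup = BeautifulSoup(page, 'html.parser')
assert [el.get_text() for el in soup.select('.sametext')] == ['keep me']

# XPath path (the kind of expression used in test_xpath_selector.py)
tree = lxml_html.fromstring(page)
assert tree.xpath("//*[contains(@class, 'sametext')]/text()") == ['keep me']
```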