Switch to a Vue-based web app
* Init Webpack / Babel / etc. setup.
* Build the app using Vue, Vue-router and Vuex.
* Add i18n support.

Some backend changes were made to match the webapp development:

* Return the flat status as a single string ("new" rather than "FlatStatus.new").
* Completely switch to calling the Weboob API directly for fetching.
* Use Canister for Bottle logging.
* Handle merging of the details dict better.
* Add a WSGI script.
* Keep track of duplicates.
* The webserver had to be restarted to fetch external changes to the db.
* Handle the leboncoin module better.

Also add contribution guidelines.

Closes issue #3. Closes issue #14.
This commit is contained in:
parent
4966fe2111
commit
a57d9ce8e3
.babelrc (new file, 4 lines)

@@ -0,0 +1,4 @@
{
    "presets": ["es2015", "stage-0"],
    "plugins": ["transform-runtime"]
}
.eslintrc (new file, 10 lines)

@@ -0,0 +1,10 @@
{
    extends: ["vue", /* your other extends */],
    plugins: ["vue"],
    "env": {
        "browser": true
    },
    rules: {
        'indent': ["error", 4, { 'SwitchCase': 1 }],
    }
}
.gitignore (vendored)

@@ -1,6 +1,8 @@
build
*.json
*.pyc
*.swp
*.swo
*.db
config/
node_modules
flatisfy/web/static/js
CONTRIBUTING.md (new file, 46 lines)

@@ -0,0 +1,46 @@
## TL;DR

* Branch off `master`.
* One feature per commit.
* If changes are requested, amend your commit.


## Useful info

* There is a `hooks/pre-commit` file which can be used as a `pre-commit` git
  hook to check coding style.
* Python coding style is PEP8. JS coding style is enforced by `eslint`.
* Some useful `npm` scripts are provided (`build` / `watch` / `lint`).


## Translating the webapp

If you want to translate the webapp, just create a new folder in
`flatisfy/web/js_src/i18n` with the short name of your locale (typically, `en`
is for English). Copy the `flatisfy/web/js_src/i18n/en/index.js` file to this
new folder and translate the `messages` strings.

Then, edit the `flatisfy/web/js_src/i18n/index.js` file to include your new
locale.


## How to contribute

* If you're thinking about a new feature, see if there's already an issue open
  about it, or please open one otherwise. This will ensure that everybody is on
  track for the feature and willing to see it in Flatisfy.
* One commit per feature.
* Branch off the `master` branch.
* Check the linting of your code before opening a PR.
* Ideally, your merge request should be mergeable without any merge commit, that
  is, it should be a fast-forward merge. For this to happen, your code needs to
  always be rebased onto `master`. Again, this is something nice to have that
  I expect from recurring contributors, but not a big deal if you don't do it
  otherwise.
* I'll look at it and might ask for a few changes. In this case, please create
  new commits. When the final result looks good, I may ask you to squash the
  WIP commits into a single one, to maintain the invariant of "one feature, one
  commit".


Thanks!
@@ -104,6 +104,11 @@ The content of this repository is licensed under an MIT license, unless
explicitly mentioned otherwise.


## Contributing

See the `CONTRIBUTING.md` file for more info.


## Thanks

* [Weboob](http://weboob.org/)
@@ -90,6 +90,10 @@ def parse_args(argv=None):
    subparsers.add_parser("import", parents=[parent_parser],
                          help="Import housing posts in database.")

    # Purge subcommand parser
    subparsers.add_parser("purge", parents=[parent_parser],
                          help="Purge database.")

    # Serve subcommand parser
    parser_serve = subparsers.add_parser("serve", parents=[parent_parser],
                                         help="Serve the web app.")
@@ -103,6 +107,7 @@ def main():
    """
    Main module code.
    """
    # pylint: disable=locally-disabled,too-many-branches
    # Parse arguments
    args = parse_args()
@@ -163,7 +168,12 @@ def main():
        )
    # Import command
    elif args.cmd == "import":
        # TODO: Do not fetch details for already imported flats / use the last
        # timestamp
        cmds.import_and_filter(config)
    # Purge command
    elif args.cmd == "purge":
        cmds.purge_db(config)
    # Serve command
    elif args.cmd == "serve":
        cmds.serve(config)
@@ -4,6 +4,8 @@ Main commands available for flatisfy.
"""
from __future__ import absolute_import, print_function, unicode_literals

import logging

import flatisfy.filters
from flatisfy import database
from flatisfy.models import flat as flat_model
@@ -12,6 +14,9 @@ from flatisfy import tools
from flatisfy.web import app as web_app


LOGGER = logging.getLogger(__name__)


def fetch_and_filter(config):
    """
    Fetch the available flats list. Then, filter it according to criteria.
@@ -34,9 +39,9 @@ def fetch_and_filter(config):
    # additional infos
    if config["passes"] > 1:
        # Load additional infos
-       for flat in flats_list:
-           details = fetch.fetch_details(flat["id"])
-           flat = tools.merge_dicts(flat, details)
+       for i, flat in enumerate(flats_list):
+           details = fetch.fetch_details(config, flat["id"])
+           flats_list[i] = tools.merge_dicts(flat, details)

    flats_list, extra_ignored_flats = flatisfy.filters.second_pass(
        flats_list, config
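The `enumerate` change above fixes a classic Python pitfall: rebinding the loop variable never writes back into the list. A minimal self-contained sketch (the `update_all_*` helper names are hypothetical, not from the repo):

```python
def update_all_broken(items):
    """Rebinds the loop variable: the list still holds the old dicts."""
    for item in items:
        item = dict(item, seen=True)  # only rebinds the local name "item"
    return items


def update_all_fixed(items):
    """Assigns through the index, as the diff does with enumerate()."""
    for i, item in enumerate(items):
        items[i] = dict(item, seen=True)
    return items
```

This is why the diff switches to `flats_list[i] = tools.merge_dicts(...)` rather than assigning to `flat`.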
@@ -83,7 +88,7 @@ def import_and_filter(config):
    :return: ``None``.
    """
    # Fetch and filter flats list
-   flats_list, purged_list = fetch_and_filter(config)
+   flats_list, ignored_list = fetch_and_filter(config)
    # Create database connection
    get_session = database.init_db(config["database"])

@@ -92,12 +97,27 @@ def import_and_filter(config):
        flat = flat_model.Flat.from_dict(flat_dict)
        session.merge(flat)

-   for flat_dict in purged_list:
+   for flat_dict in ignored_list:
        flat = flat_model.Flat.from_dict(flat_dict)
-       flat.status = flat_model.FlatStatus.purged
+       flat.status = flat_model.FlatStatus.ignored
        session.merge(flat)


def purge_db(config):
    """
    Purge the database.

    :param config: A config dict.
    :return: ``None``
    """
    get_session = database.init_db(config["database"])

    with get_session() as session:
        # Delete every flat in the db
        LOGGER.info("Purge all flats from the database.")
        session.query(flat_model.Flat).delete(synchronize_session=False)
||||
def serve(config):
|
||||
"""
|
||||
Serve the web app.
|
||||
@ -106,5 +126,11 @@ def serve(config):
|
||||
:return: ``None``, long-running process.
|
||||
"""
|
||||
app = web_app.get_app(config)
|
||||
# TODO: Make Bottle use logging module
|
||||
app.run(host=config["host"], port=config["port"])
|
||||
|
||||
server = config.get("webserver", None)
|
||||
if not server:
|
||||
# Default webserver is quiet, as Bottle is used with Canister for
|
||||
# standard logging
|
||||
server = web_app.QuietWSGIRefServer
|
||||
|
||||
app.run(host=config["host"], port=config["port"], server=server)
|
||||
|
@@ -21,10 +21,11 @@ from flatisfy import tools

# Default configuration
DEFAULT_CONFIG = {
    # Flatboob queries to fetch
    "queries": [],
    # Constraints to match
    "constraints": {
        "type": None,  # RENT, SALE, SHARING
        "house_types": [],  # List of house types, must be in APART, HOUSE,
                            # PARKING, LAND, OTHER or UNKNOWN
        "postal_codes": [],  # List of postal codes
        "area": (None, None),  # (min, max) in m^2
        "cost": (None, None),  # (min, max) in currency unit
@@ -42,12 +43,18 @@ DEFAULT_CONFIG = {
    "max_entries": None,
    # Directory in which data will be put. ``None`` is XDG default location.
    "data_directory": None,
+   # Path to the modules directory containing all Weboob modules. ``None`` if
+   # ``weboob_modules`` package is pip-installed, and you want to use
+   # ``pkgresource`` to automatically find it.
+   "modules_path": None,
    # SQLAlchemy URI to the database to use
    "database": None,
    # Web app port
    "port": 8080,
    # Web app host to listen on
-   "host": "127.0.0.1"
+   "host": "127.0.0.1",
+   # Web server to use to serve the webapp (see Bottle deployment doc)
+   "webserver": None
}

LOGGER = logging.getLogger(__name__)
@@ -68,7 +75,7 @@ def validate_config(config):
    assert all(
        x is None or
        (
-           (isinstance(x, int) or isinstance(x, float)) and
+           isinstance(x, (float, int)) and
            x >= 0
        )
        for x in bounds
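The tightened assert above relies on `isinstance` accepting a tuple of types, which replaces the chained `or`. A standalone sketch of the same bounds invariant (the `check_bounds` name is hypothetical, mirroring the repo's `_check_constraints_bounds`):

```python
def check_bounds(bounds):
    """Assert bounds is a (min, max) pair of None or non-negative numbers."""
    assert len(bounds) == 2
    assert all(
        x is None or (isinstance(x, (float, int)) and x >= 0)
        for x in bounds
    )
    return True
```

`check_bounds((None, 42.0))` passes, while a negative minimum raises `AssertionError`.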
@@ -81,9 +88,19 @@ def validate_config(config):
    # Then, we disable line-too-long pylint check and E501 flake8 checks
    # and use long lines whenever needed, in order to have the full assert
    # message in the log output.
-   # pylint: disable=line-too-long
+   # pylint: disable=locally-disabled,line-too-long
    assert "type" in config["constraints"]
    assert config["constraints"]["type"].upper() in ["RENT",
                                                     "SALE", "SHARING"]

    assert "house_types" in config["constraints"]
    assert config["constraints"]["house_types"]
    for house_type in config["constraints"]["house_types"]:
        assert house_type.upper() in ["APART", "HOUSE", "PARKING", "LAND",
                                      "OTHER", "UNKNOWN"]

    assert "postal_codes" in config["constraints"]
-   assert len(config["constraints"]["postal_codes"]) > 0
+   assert config["constraints"]["postal_codes"]

    assert "area" in config["constraints"]
    _check_constraints_bounds(config["constraints"]["area"])
@@ -111,11 +128,13 @@ def validate_config(config):
        assert config["max_entries"] is None or (isinstance(config["max_entries"], int) and config["max_entries"] > 0)  # noqa: E501

        assert config["data_directory"] is None or isinstance(config["data_directory"], str)  # noqa: E501
+       assert config["modules_path"] is None or isinstance(config["modules_path"], str)  # noqa: E501

        assert config["database"] is None or isinstance(config["database"], str)  # noqa: E501

        assert isinstance(config["port"], int)
        assert isinstance(config["host"], str)
+       assert config["webserver"] is None or isinstance(config["webserver"], str)  # noqa: E501

        return True
    except (AssertionError, KeyError):
@@ -140,10 +159,11 @@ def load_config(args=None):
    try:
        with open(args.config, "r") as fh:
            config_data.update(json.load(fh))
-   except (IOError, ValueError):
+   except (IOError, ValueError) as exc:
        LOGGER.error(
            "Unable to load configuration from file, "
-           "using default configuration."
+           "using default configuration: %s.",
+           exc
        )

    # Overload config with arguments
@@ -188,9 +208,8 @@ def load_config(args=None):
    if config_validation is True:
        LOGGER.info("Config has been fully initialized.")
        return config_data
-   else:
-       LOGGER.error("Error in configuration: %s.", config_validation)
-       return None
+   LOGGER.error("Error in configuration: %s.", config_validation)
+   return None


def init_config(output=None):
@@ -9,6 +9,7 @@ import collections
import json
import logging
import os
import shutil

import flatisfy.exceptions
@@ -157,7 +158,7 @@ def load_data(data_type, config):
        LOGGER.error("Invalid JSON data file: %s.", datafile_path)
        return None

-   if len(data) == 0:
+   if not data:
        LOGGER.warning("Loading empty data for %s.", data_type)

    return data
@@ -41,16 +41,18 @@ def init_db(database_uri=None):

    engine = create_engine(database_uri)
    BASE.metadata.create_all(engine, checkfirst=True)
-   Session = sessionmaker(bind=engine)  # pylint: disable=invalid-name
+   Session = sessionmaker(bind=engine)  # pylint: disable=locally-disabled,invalid-name

    @contextmanager
    def get_session():
+       # pylint: disable=locally-disabled,line-too-long
        """
        Provide a transactional scope around a series of operations.

        From [1].
        [1]: http://docs.sqlalchemy.org/en/latest/orm/session_basics.html#when-do-i-construct-a-session-when-do-i-commit-it-and-when-do-i-close-it.
        """
+       # pylint: enable=line-too-long,locally-disabled
        session = Session()
        try:
            yield session
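The `get_session` wrapper above follows the standard transactional-scope pattern from the SQLAlchemy docs. A minimal sketch of the same idea using the stdlib `sqlite3` module instead of SQLAlchemy (an assumed simplification, not the repo's code):

```python
import contextlib
import sqlite3


@contextlib.contextmanager
def transactional_session(uri=":memory:"):
    """Commit on success, roll back on error, always close the connection."""
    conn = sqlite3.connect(uri)
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()


with transactional_session() as session:
    session.execute("CREATE TABLE flats (id TEXT)")
    session.execute("INSERT INTO flats VALUES ('1@pap')")
    count = session.execute("SELECT COUNT(*) FROM flats").fetchone()[0]
```

The caller never commits or rolls back explicitly, which is exactly what `import_and_filter` relies on when it only calls `session.merge(flat)`.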
@@ -46,5 +46,5 @@ class StringyJSON(types.TypeDecorator):

# TypeEngine.with_variant says "use StringyJSON instead when
# connecting to 'sqlite'"
-# pylint: disable=invalid-name
+# pylint: disable=locally-disabled,invalid-name
MagicJSON = types.JSON().with_variant(StringyJSON, 'sqlite')
@@ -4,14 +4,159 @@ This module contains all the code related to fetching and loading flats lists.
"""
from __future__ import absolute_import, print_function, unicode_literals

import itertools
import json
import logging
import subprocess

from flatisfy import data
from flatisfy import tools

LOGGER = logging.getLogger(__name__)


try:
    from weboob.capabilities.housing import Query
    from weboob.core.ouiboube import WebNip
    from weboob.tools.json import WeboobEncoder
except ImportError:
    LOGGER.error("Weboob is not available on your system. Make sure you "
                 "installed it.")
    raise


class WeboobProxy(object):
    """
    Wrapper around Weboob ``WebNip`` class, to fetch housing posts without
    having to spawn a subprocess.
    """
    @staticmethod
    def version():
        """
        Get Weboob version.

        :return: The installed Weboob version.
        """
        return WebNip.VERSION

    def __init__(self, config):
        """
        Create a Weboob handle and try to load the modules.

        :param config: A config dict.
        """
        # Create base WebNip object
        self.webnip = WebNip(modules_path=config["modules_path"])

        # Create backends
        self.backends = [
            self.webnip.load_backend(
                module,
                module,
                params={}
            )
            for module in ["seloger", "pap", "leboncoin", "logicimmo",
                           "explorimmo", "entreparticuliers"]
        ]

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.webnip.deinit()

    def build_queries(self, constraints_dict):
        """
        Build Weboob ``weboob.capabilities.housing.Query`` objects from the
        constraints defined in the configuration. Each query has at most 3
        postal codes, to comply with housing websites limitations.

        :param constraints_dict: A dictionary of constraints, as defined in the
            config.
        :return: A list of Weboob ``weboob.capabilities.housing.Query``
            objects. Returns ``None`` if an error occurred.
        """
        queries = []
        for postal_codes in tools.batch(constraints_dict["postal_codes"], 3):
            query = Query()
            query.cities = []
            for postal_code in postal_codes:
                try:
                    for city in self.webnip.do("search_city", postal_code):
                        query.cities.append(city)
                except IndexError:
                    LOGGER.error(
                        "Postal code %s could not be matched with a city.",
                        postal_code
                    )
                    return None

            try:
                query.house_types = [
                    getattr(
                        Query.HOUSE_TYPES,
                        house_type.upper()
                    )
                    for house_type in constraints_dict["house_types"]
                ]
            except AttributeError:
                LOGGER.error("Invalid house types constraint.")
                return None

            try:
                query.type = getattr(
                    Query,
                    "TYPE_{}".format(constraints_dict["type"].upper())
                )
            except AttributeError:
                LOGGER.error("Invalid post type constraint.")
                return None

            query.area_min = constraints_dict["area"][0]
            query.area_max = constraints_dict["area"][1]
            query.cost_min = constraints_dict["cost"][0]
            query.cost_max = constraints_dict["cost"][1]
            query.nb_rooms = constraints_dict["rooms"][0]

            queries.append(query)

        return queries

    def query(self, query, max_entries=None):
        """
        Fetch the housing posts matching a given Weboob query.

        :param query: A Weboob ``weboob.capabilities.housing.Query`` object.
        :param max_entries: Maximum number of entries to fetch.
        :return: The matching housing posts, dumped as a list of JSON objects.
        """
        housings = []
        # TODO: Handle max_entries better
        for housing in itertools.islice(
                self.webnip.do('search_housings', query),
                max_entries
        ):
            housings.append(json.dumps(housing, cls=WeboobEncoder))
        return housings

    def info(self, full_flat_id):
        """
        Get information (details) about a housing post.

        :param full_flat_id: A Weboob housing post id, in complete form
            (ID@BACKEND)
        :return: The details in JSON.
        """
        flat_id, backend_name = full_flat_id.rsplit("@", 1)
        backend = next(
            backend
            for backend in self.backends
            if backend.name == backend_name
        )
        housing = backend.get_housing(flat_id)
        housing.id = full_flat_id  # Otherwise, we miss the @backend afterwards
        return json.dumps(housing, cls=WeboobEncoder)


def fetch_flats_list(config):
    """
    Fetch the available flats using the Flatboob / Weboob config.
@@ -20,40 +165,35 @@ def fetch_flats_list(config):
    :return: A list of all available flats.
    """
    flats_list = []
-   for query in config["queries"]:
-       max_entries = config["max_entries"]
-       if max_entries is None:
-           max_entries = 0
-
-       LOGGER.info("Loading flats from query %s.", query)
-       flatboob_output = subprocess.check_output(
-           ["../weboob/tools/local_run.sh", "../weboob/scripts/flatboob",
-            "-n", str(max_entries), "-f", "json", "load", query]
-       )
-       query_flats_list = json.loads(flatboob_output)
-       LOGGER.info("Fetched %d flats.", len(query_flats_list))
-       flats_list.extend(query_flats_list)
-   LOGGER.info("Fetched a total of %d flats.", len(flats_list))
+   with WeboobProxy(config) as weboob_proxy:
+       LOGGER.info("Loading flats...")
+       queries = weboob_proxy.build_queries(config["constraints"])
+       housing_posts = []
+       for query in queries:
+           housing_posts.extend(
+               weboob_proxy.query(query, config["max_entries"])
+           )
+   LOGGER.info("Fetched %d flats.", len(housing_posts))

+   flats_list = [json.loads(flat) for flat in housing_posts]
    return flats_list


-def fetch_details(flat_id):
+def fetch_details(config, flat_id):
    """
    Fetch the additional details for a flat using Flatboob / Weboob.

+   :param config: A config dict.
    :param flat_id: ID of the flat to fetch details for.
    :return: A flat dict with all the available data.
    """
-   LOGGER.info("Loading additional details for flat %s.", flat_id)
-   flatboob_output = subprocess.check_output(
-       ["../weboob/tools/local_run.sh", "../weboob/scripts/flatboob",
-        "-f", "json", "info", flat_id]
-   )
-   flat_details = json.loads(flatboob_output)
-   LOGGER.info("Fetched details for flat %s.", flat_id)
+   with WeboobProxy(config) as weboob_proxy:
+       LOGGER.info("Loading additional details for flat %s.", flat_id)
+       weboob_output = weboob_proxy.info(flat_id)

-   if flat_details:
-       flat_details = flat_details[0]
+   flat_details = json.loads(weboob_output)
+   LOGGER.info("Fetched details for flat %s.", flat_id)

    return flat_details
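`WeboobProxy.info()` splits the full post id on the last `@` to pick a backend; using `rsplit` with `maxsplit=1` keeps any `@` inside the id itself intact. A sketch (hypothetical `split_flat_id` helper name):

```python
def split_flat_id(full_flat_id):
    """Split an "ID@BACKEND" identifier into its two parts."""
    flat_id, backend_name = full_flat_id.rsplit("@", 1)
    return flat_id, backend_name
```

For example, `split_flat_id("123abc@seloger")` yields the pair of the raw post id and the backend name used to look up the matching Weboob backend.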
@@ -89,9 +89,10 @@ def first_pass(flats_list, config):

    :param flats_list: A list of flats dict to filter.
    :param config: A config dict.
-   :return: A tuple of processed flats and purged flats.
+   :return: A tuple of processed flats and ignored flats.
    """
    LOGGER.info("Running first filtering pass.")
+
    # Handle duplicates based on ids
    # Just remove them (no merge) as they should be the exact same object.
    flats_list = duplicates.detect(
@@ -105,16 +106,16 @@ def first_pass(flats_list, config):
        flats_list, key="url", merge=True
    )

-   # Add the flatisfy metadata entry
+   # Add the flatisfy metadata entry and prepare the flat objects
    flats_list = metadata.init(flats_list)
    # Guess the postal codes
    flats_list = metadata.guess_postal_code(flats_list, config)
    # Try to match with stations
    flats_list = metadata.guess_stations(flats_list, config)
    # Remove returned housing posts that do not match criteria
-   flats_list, purged_list = refine_with_housing_criteria(flats_list, config)
+   flats_list, ignored_list = refine_with_housing_criteria(flats_list, config)

-   return (flats_list, purged_list)
+   return (flats_list, ignored_list)


def second_pass(flats_list, config):
@@ -130,7 +131,7 @@ def second_pass(flats_list, config):

    :param flats_list: A list of flats dict to filter.
    :param config: A config dict.
-   :return: A tuple of processed flats and purged flats.
+   :return: A tuple of processed flats and ignored flats.
    """
    LOGGER.info("Running second filtering pass.")
    # Assumed to run after first pass, so there should be no obvious duplicates
@@ -148,6 +149,6 @@ def second_pass(flats_list, config):
    flats_list = metadata.compute_travel_times(flats_list, config)

    # Remove returned housing posts that do not match criteria
-   flats_list, purged_list = refine_with_housing_criteria(flats_list, config)
+   flats_list, ignored_list = refine_with_housing_criteria(flats_list, config)

-   return (flats_list, purged_list)
+   return (flats_list, ignored_list)
@@ -5,9 +5,23 @@ Filtering functions to detect and merge duplicates.
from __future__ import absolute_import, print_function, unicode_literals

import collections
import logging

from flatisfy import tools

LOGGER = logging.getLogger(__name__)

# Some backends give more info than others. Here is the precedence we want to
# use.
BACKENDS_PRECEDENCE = [
    "seloger",
    "pap",
    "leboncoin",
    "explorimmo",
    "logicimmo",
    "entreparticuliers"
]


def detect(flats_list, key="id", merge=True):
    """
@@ -27,7 +41,6 @@ def detect(flats_list, key="id", merge=True):

    :return: A deduplicated list of flat dicts.
    """
-   # TODO: Keep track of found duplicates?
    # ``seen`` is a dict aggregating the flats by the deduplication
    # keys. We basically make buckets of flats for every key value. Flats in
    # the same bucket should be merged together afterwards.
@@ -44,6 +57,18 @@ def detect(flats_list, key="id", merge=True):
            # of the others, to avoid over-deduplication.
            unique_flats_list.extend(matching_flats)
        else:
            # Sort matching flats by backend precedence
            matching_flats.sort(
                key=lambda flat: next(
                    i for (i, backend) in enumerate(BACKENDS_PRECEDENCE)
                    if flat["id"].endswith(backend)
                ),
                reverse=True
            )

            if len(matching_flats) > 1:
                LOGGER.info("Found duplicates: %s.",
                            [flat["id"] for flat in matching_flats])
            # Otherwise, check the policy
            if merge:
                # If a merge is requested, do the merge
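The precedence sort above orders each bucket of duplicates so that the flat from the most trusted backend (lowest index in `BACKENDS_PRECEDENCE`) ends up last, which lets the subsequent merge let it win. A self-contained sketch over bare id strings (the `sort_by_precedence` name is hypothetical):

```python
BACKENDS_PRECEDENCE = ["seloger", "pap", "leboncoin", "explorimmo",
                       "logicimmo", "entreparticuliers"]


def sort_by_precedence(flat_ids):
    """Order ids so the highest-precedence backend comes last (reverse sort)."""
    return sorted(
        flat_ids,
        key=lambda fid: next(
            i for (i, backend) in enumerate(BACKENDS_PRECEDENCE)
            if fid.endswith(backend)
        ),
        reverse=True,
    )
```

Note that `next(...)` raises `StopIteration` for an id whose backend is not in the list, so the list must cover every loaded backend.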
@@ -20,14 +20,20 @@ LOGGER = logging.getLogger(__name__)
def init(flats_list):
    """
    Create a flatisfy key containing a dict of metadata fetched by flatisfy for
-   each flat in the list.
+   each flat in the list. Also perform some basic transform on flat objects to
+   prepare for the metadata fetching.

    :param flats_list: A list of flats dict.
    :return: The updated list
    """
    for flat in flats_list:
+       # Init flatisfy key
        if "flatisfy" not in flat:
            flat["flatisfy"] = {}
+       # Move url key to urls
+       flat["urls"] = [flat["url"]]
+       # Create merged_ids key
+       flat["merged_ids"] = [flat["id"]]
    return flats_list


@@ -298,11 +304,17 @@ def guess_stations(flats_list, config, distance_threshold=1500):
        # If some stations were already filled in and the result is different,
        # display some warning to the user
        if (
-           "matched_stations" in flat["flatisfy"]["matched_stations"] and
+           "matched_stations" in flat["flatisfy"] and
            (
                # Do a set comparison, as ordering is not important
-               set(flat["flatisfy"]["matched_stations"]) !=
-               set(good_matched_stations)
+               set([
+                   station["name"]
+                   for station in flat["flatisfy"]["matched_stations"]
+               ]) !=
+               set([
+                   station["name"]
+                   for station in good_matched_stations
+               ])
            )
        ):
            LOGGER.warning(
@@ -2,9 +2,12 @@
"""
This module defines an SQLAlchemy ORM model for a flat.
"""
-# pylint: disable=invalid-name,too-few-public-methods
+# pylint: disable=locally-disabled,invalid-name,too-few-public-methods
from __future__ import absolute_import, print_function, unicode_literals

import logging

import arrow
import enum

from sqlalchemy import Column, DateTime, Enum, Float, String, Text
@@ -13,15 +16,29 @@ from flatisfy.database.base import BASE
from flatisfy.database.types import MagicJSON


LOGGER = logging.getLogger(__name__)


class FlatUtilities(enum.Enum):
    """
    An enum of the possible utilities status for a flat entry.
    """
    included = 10
    unknown = 0
    excluded = -10


class FlatStatus(enum.Enum):
    """
    An enum of the possible status for a flat entry.
    """
-   purged = -10
+   user_deleted = -100
+   ignored = -10
    new = 0
-   contacted = 10
-   answer_no = 20
-   answer_yes = 21
+   followed = 10
+   contacted = 20
+   answer_no = 30
+   answer_yes = 31


class Flat(BASE):
@@ -36,6 +53,7 @@ class Flat(BASE):
    bedrooms = Column(Float)
    cost = Column(Float)
    currency = Column(String)
+   utilities = Column(Enum(FlatUtilities), default=FlatUtilities.unknown)
    date = Column(DateTime)
    details = Column(MagicJSON)
    location = Column(String)
@@ -45,7 +63,8 @@ class Flat(BASE):
    station = Column(String)
    text = Column(Text)
    title = Column(String)
-   url = Column(String)
+   urls = Column(MagicJSON)
+   merged_ids = Column(MagicJSON)

    # Flatisfy data
    # TODO: Should be in another table with relationships
@@ -65,25 +84,45 @@ class Flat(BASE):
        # Handle flatisfy metadata
        flat_dict = flat_dict.copy()
        flat_dict["flatisfy_stations"] = (
-           flat_dict["flatisfy"].get("matched_stations", None)
+           flat_dict["flatisfy"].get("matched_stations", [])
        )
        flat_dict["flatisfy_postal_code"] = (
            flat_dict["flatisfy"].get("postal_code", None)
        )
        flat_dict["flatisfy_time_to"] = (
-           flat_dict["flatisfy"].get("time_to", None)
+           flat_dict["flatisfy"].get("time_to", {})
        )
        del flat_dict["flatisfy"]

        # Handle utilities field
        if not isinstance(flat_dict["utilities"], FlatUtilities):
            if flat_dict["utilities"] == "C.C.":
                flat_dict["utilities"] = FlatUtilities.included
            elif flat_dict["utilities"] == "H.C.":
                flat_dict["utilities"] = FlatUtilities.excluded
            else:
                flat_dict["utilities"] = FlatUtilities.unknown

        # Handle status field
        flat_status = flat_dict.get("status", "new")
        if not isinstance(flat_status, FlatStatus):
            try:
                flat_dict["status"] = getattr(FlatStatus, flat_status)
            except AttributeError:
                if "status" in flat_dict:
                    del flat_dict["status"]
                LOGGER.warning("Unknown flat status %s, ignoring it.",
                               flat_status)

        # Handle date field
-       flat_dict["date"] = None  # TODO
+       flat_dict["date"] = arrow.get(flat_dict["date"]).naive

        flat_object = Flat()
        flat_object.__dict__.update(flat_dict)
        return flat_object

    def __repr__(self):
-       return "<Flat(id=%s, url=%s)>" % (self.id, self.url)
+       return "<Flat(id=%s, urls=%s)>" % (self.id, self.urls)


    def json_api_repr(self):
@@ -96,6 +135,9 @@ class Flat(BASE):
            for k, v in self.__dict__.items()
            if not k.startswith("_")
        }
-       flat_repr["status"] = str(flat_repr["status"])
+       if isinstance(flat_repr["status"], FlatStatus):
+           flat_repr["status"] = flat_repr["status"].name
+       if isinstance(flat_repr["utilities"], FlatUtilities):
+           flat_repr["utilities"] = flat_repr["utilities"].name

        return flat_repr
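The `json_api_repr` change is what implements the commit message's "return the flat status as a single string": `str()` on an enum member yields `"FlatStatus.new"`, while `.name` yields the bare `"new"` the webapp expects. A self-contained sketch (trimmed `FlatStatus` and a hypothetical `serialize_status` helper):

```python
import enum


class FlatStatus(enum.Enum):
    """Subset of the model's statuses, enough to show the serialization."""
    ignored = -10
    new = 0


def serialize_status(status):
    """Emit the bare member name ("new"), not "FlatStatus.new"."""
    return status.name if isinstance(status, FlatStatus) else status
```

The `isinstance` guard mirrors the diff: a value that is already a plain string passes through unchanged.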
@@ -8,6 +8,7 @@ from __future__ import (
)

import datetime
import itertools
import json
import logging
import math
@@ -23,6 +24,16 @@ LOGGER = logging.getLogger(__name__)
NAVITIA_ENDPOINT = "https://api.navitia.io/v1/coverage/fr-idf/journeys"


class DateAwareJSONEncoder(json.JSONEncoder):
    """
    Extend the default JSON encoder to serialize datetimes to iso strings.
    """
    def default(self, o):  # pylint: disable=locally-disabled,E0202
        if isinstance(o, (datetime.date, datetime.datetime)):
            return o.isoformat()
        return json.JSONEncoder.default(self, o)


def pretty_json(data):
    """
    Pretty JSON output.
@@ -38,10 +49,25 @@ def pretty_json(data):
        "toto": "ok"
    }
    """
-   return json.dumps(data, indent=4, separators=(',', ': '),
+   return json.dumps(data, cls=DateAwareJSONEncoder,
+                     indent=4, separators=(',', ': '),
                      sort_keys=True)
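The `default()` hook above is the standard way to teach `json.dumps` about extra types: it is only called for objects the encoder cannot serialize on its own. A self-contained usage sketch of the same encoder:

```python
import datetime
import json


class DateAwareJSONEncoder(json.JSONEncoder):
    """Serialize dates and datetimes to ISO strings, as in the diff above."""
    def default(self, o):
        if isinstance(o, (datetime.date, datetime.datetime)):
            return o.isoformat()
        return json.JSONEncoder.default(self, o)


encoded = json.dumps({"date": datetime.date(2017, 4, 13)},
                     cls=DateAwareJSONEncoder)
```

Delegating back to `json.JSONEncoder.default` for unknown types keeps the usual `TypeError` behavior for genuinely unserializable objects.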

def batch(iterable, size):
    """
    Get items from a sequence a batch at a time.

    :param iterable: The iterable to get the items from.
    :param size: The size of the batches.
    :return: A new iterable.
    """
    sourceiter = iter(iterable)
    while True:
        batchiter = itertools.islice(sourceiter, size)
        try:
            # Note: the committed code used ``batchiter.next()``, which is
            # Python 2 only; ``next()`` with an explicit StopIteration check
            # works on both and terminates cleanly.
            yield itertools.chain([next(batchiter)], batchiter)
        except StopIteration:
            return
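`batch` is what lets `build_queries` group postal codes three at a time. A self-contained variant that materializes each chunk as a list for clarity (the `batch_lists` name is hypothetical; the repo's version yields lazy `itertools.chain` objects instead):

```python
import itertools


def batch_lists(iterable, size):
    """Chunk an iterable into lists of at most ``size`` items."""
    sourceiter = iter(iterable)
    while True:
        chunk = list(itertools.islice(sourceiter, size))
        if not chunk:
            return
        yield chunk


grouped = list(batch_lists(["75013", "75014", "75015", "92120"], 3))
```

The last chunk is simply shorter when the input length is not a multiple of `size`, matching the "at most 3 postal codes per query" behavior.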
def is_within_interval(value, min_value=None, max_value=None):
|
||||
"""
|
||||
Check whether a variable is within a given interval. Assumes the value is
@@ -142,7 +168,7 @@ def distance(gps1, gps2):
    lat2 = math.radians(gps2[0])
    long2 = math.radians(gps2[1])

-    # pylint: disable=invalid-name
+    # pylint: disable=locally-disabled,invalid-name
    a = (
        math.sin((lat2 - lat1) / 2.0)**2 +
        math.cos(lat1) * math.cos(lat2) * math.sin((long2 - long1) / 2.0)**2
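For context, the hunk above sits inside a standard haversine computation; a self-contained sketch of the whole formula (the 6 371 km mean Earth radius is an assumption, not taken from the diff):

```python
import math


def distance(gps1, gps2):
    """Haversine distance between two (lat, lng) tuples, in meters."""
    lat1 = math.radians(gps1[0])
    long1 = math.radians(gps1[1])
    lat2 = math.radians(gps2[0])
    long2 = math.radians(gps2[1])
    a = (
        math.sin((lat2 - lat1) / 2.0)**2 +
        math.cos(lat1) * math.cos(lat2) * math.sin((long2 - long1) / 2.0)**2
    )
    c = 2.0 * math.atan2(math.sqrt(a), math.sqrt(1.0 - a))
    return 6371000.0 * c  # assumed mean Earth radius, in meters


# One degree of latitude is roughly 111.2 km
print(round(distance((0.0, 0.0), (1.0, 0.0))))
```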
@@ -175,22 +201,30 @@ def merge_dicts(*args):
    """
    if len(args) == 1:
        return args[0]
-    else:
-        flat1, flat2 = args[:2]
-        merged_flat = {}
-        for k, value2 in flat2.items():
-            value1 = flat1.get(k, None)
-            if value1 is None:
-                # flat1 has empty matching field, just keep the flat2 field
-                merged_flat[k] = value2
-            elif value2 is None:
-                # flat2 field is empty, just keep the flat1 field
-                merged_flat[k] = value1
-            else:
-                # Any other case, we should merge
-                # TODO: Do the merge
-                merged_flat[k] = value1
-        return merge_dicts(merged_flat, *args[2:])
+
+    flat1, flat2 = args[:2]  # pylint: disable=locally-disabled,unbalanced-tuple-unpacking,line-too-long
+    merged_flat = {}
+    for k, value2 in flat2.items():
+        value1 = flat1.get(k, None)
+
+        if k in ["urls", "merged_ids"]:
+            # Handle special fields separately
+            merged_flat[k] = list(set(value2 + value1))
+            continue
+
+        if not value1:
+            # flat1 has empty matching field, just keep the flat2 field
+            merged_flat[k] = value2
+        elif not value2:
+            # flat2 field is empty, just keep the flat1 field
+            merged_flat[k] = value1
+        else:
+            # Any other case, we should keep the value of the more recent flat
+            # dict (the one most at right in arguments)
+            merged_flat[k] = value2
+    for k in [key for key in flat1.keys() if key not in flat2.keys()]:
+        merged_flat[k] = flat1[k]
+    return merge_dicts(merged_flat, *args[2:])
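The new merge policy above (right-most dict wins, list-valued fields unioned) can be checked in isolation; a standalone sketch with made-up flat data:

```python
def merge_dicts(*args):
    """Merge flat dicts; the most recent (right-most) value wins.

    Simplified sketch of the merge logic in the hunk above.
    """
    if len(args) == 1:
        return args[0]
    flat1, flat2 = args[:2]
    merged_flat = {}
    for k, value2 in flat2.items():
        value1 = flat1.get(k, None)
        if k in ["urls", "merged_ids"]:
            # Special list fields: keep the union of both sides
            merged_flat[k] = list(set(value2 + value1))
            continue
        if not value1:
            merged_flat[k] = value2
        elif not value2:
            merged_flat[k] = value1
        else:
            # Otherwise the most recent (right-most) dict wins
            merged_flat[k] = value2
    for k in [key for key in flat1 if key not in flat2]:
        merged_flat[k] = flat1[k]
    return merge_dicts(merged_flat, *args[2:])


merged = merge_dicts(
    {"cost": 1000, "area": None, "urls": ["a"]},
    {"cost": 1100, "area": 42, "urls": ["b"]},
)
print(merged["cost"], merged["area"], sorted(merged["urls"]))
```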


def get_travel_time_between(latlng_from, latlng_to, config):

@@ -6,15 +6,30 @@ from __future__ import (
    absolute_import, division, print_function, unicode_literals
)

import functools
import json
import logging
import os

import bottle
import canister

from flatisfy import database
from flatisfy.tools import DateAwareJSONEncoder
from flatisfy.web.routes import api as api_routes
from flatisfy.web.configplugin import ConfigPlugin
from flatisfy.web.dbplugin import DatabasePlugin


class QuietWSGIRefServer(bottle.WSGIRefServer):
    """
    Quiet implementation of Bottle built-in WSGIRefServer, as `Canister` is
    handling the logging through standard Python logging.
    """
    # pylint: disable=locally-disabled,too-few-public-methods
    quiet = True


def _serve_static_file(filename):
    """
    Helper function to serve static file.
@@ -38,11 +53,31 @@ def get_app(config):

    app = bottle.default_app()
    app.install(DatabasePlugin(get_session))
    app.install(ConfigPlugin(config))
    app.config.setdefault("canister.log_level", logging.root.level)
    app.config.setdefault("canister.log_path", None)
    app.config.setdefault("canister.debug", False)
    app.install(canister.Canister())
    # Use DateAwareJSONEncoder to dump JSON strings
    # From http://stackoverflow.com/questions/21282040/bottle-framework-how-to-return-datetime-in-json-response#comment55718456_21282666. pylint: disable=locally-disabled,line-too-long
    bottle.install(
        bottle.JSONPlugin(
            json_dumps=functools.partial(json.dumps, cls=DateAwareJSONEncoder)
        )
    )

    # API v1 routes
    app.route("/api/v1/", "GET", api_routes.index_v1)

    app.route("/api/v1/time_to/places", "GET", api_routes.time_to_places_v1)

    app.route("/api/v1/flats", "GET", api_routes.flats_v1)
    app.route("/api/v1/flats/status/:status", "GET",
              api_routes.flats_by_status_v1)

    app.route("/api/v1/flat/:flat_id", "GET", api_routes.flat_v1)
    app.route("/api/v1/flat/:flat_id/status", "POST",
              api_routes.update_flat_status_v1)

    # Index
    app.route("/", "GET", lambda: _serve_static_file("index.html"))
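The `bottle.JSONPlugin` installation above boils down to pre-binding the custom encoder onto `json.dumps` with `functools.partial`; that part can be checked without Bottle (standalone sketch, sample payload made up):

```python
import datetime
import functools
import json


class DateAwareJSONEncoder(json.JSONEncoder):
    """Serialize dates and datetimes to ISO strings, as in flatisfy.tools."""
    def default(self, o):
        if isinstance(o, (datetime.date, datetime.datetime)):
            return o.isoformat()
        return json.JSONEncoder.default(self, o)


# This pre-bound callable is what bottle.JSONPlugin(json_dumps=...) receives
json_dumps = functools.partial(json.dumps, cls=DateAwareJSONEncoder)
print(json_dumps({"date": datetime.datetime(2017, 4, 13, 12, 0)}))
```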
72 flatisfy/web/configplugin.py Normal file
@@ -0,0 +1,72 @@
# coding: utf-8
"""
This module contains a Bottle plugin to pass the config argument to any route
which needs it.

This module is heavily based on code from
[Bottle-SQLAlchemy](https://github.com/iurisilvio/bottle-sqlalchemy) which is
licensed under MIT license.
"""
from __future__ import (
    absolute_import, division, print_function, unicode_literals
)

import functools
import inspect

import bottle


class ConfigPlugin(object):
    """
    A Bottle plugin to automatically pass the config object to the routes
    specifying they need it.
    """
    name = 'config'
    api = 2
    KEYWORD = "config"

    def __init__(self, config):
        """
        :param config: The config object to pass.
        """
        self.config = config

    def setup(self, app):  # pylint: disable=no-self-use
        """
        Make sure that other installed plugins don't affect the same
        keyword argument and check if metadata is available.
        """
        for other in app.plugins:
            if not isinstance(other, ConfigPlugin):
                continue
            else:
                raise bottle.PluginError(
                    "Found another conflicting Config plugin."
                )

    def apply(self, callback, route):
        """
        Method called on route invocation. Should apply some transformations to
        the route prior to returning it.

        We check the presence of ``self.KEYWORD`` in the route signature and
        replace the route callback by a partial invocation where we replaced
        this argument by a valid config object.
        """
        # Check whether the route needs a valid config object or not.
        try:
            callback_args = inspect.signature(route.callback).parameters
        except AttributeError:
            # inspect.signature does not exist on older Python
            callback_args = inspect.getargspec(route.callback).args

        if self.KEYWORD not in callback_args:
            # If the route does not need the config, call the callback directly
            return callback
        kwargs = {}
        kwargs[self.KEYWORD] = self.config
        return functools.partial(callback, **kwargs)


Plugin = ConfigPlugin
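The keyword-detection trick in `apply()` above can be tried standalone; a sketch with hypothetical route callbacks (Python 3 uses `inspect.signature`; the `getargspec` branch is the fallback for older interpreters):

```python
import inspect


def needs_config(route_callback, keyword="config"):
    """Return True when the callback declares a ``keyword`` parameter."""
    try:
        callback_args = inspect.signature(route_callback).parameters
    except AttributeError:
        # inspect.signature does not exist on older Python
        callback_args = inspect.getargspec(route_callback).args
    return keyword in callback_args


def route_with_config(config):
    return config["port"]


def route_without_config():
    return "static"


print(needs_config(route_with_config), needs_config(route_without_config))
```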
@@ -28,13 +28,12 @@ class DatabasePlugin(object):

    def __init__(self, get_session):
        """
        :param keyword: Keyword used to inject session database in a route
        :param create_session: SQLAlchemy session maker created with the
            'sessionmaker' function. Will create its own if undefined.
        """
        self.get_session = get_session

-    def setup(self, app):  # pylint: disable-no-self-use
+    def setup(self, app):  # pylint: disable=no-self-use
        """
        Make sure that other installed plugins don't affect the same
        keyword argument and check if metadata is available.
@@ -67,11 +66,15 @@ class DatabasePlugin(object):
            # If no need for a db session, call the route callback
            return callback
-        else:
-            # Otherwise, we get a db session and pass it to the callback
-            with self.get_session() as session:
-                kwargs = {}
-                kwargs[self.KEYWORD] = session
-                return functools.partial(callback, **kwargs)
+
+        def wrapper(*args, **kwargs):
+            """
+            Wrap the callback in a call to get_session.
+            """
+            with self.get_session() as session:
+                # Get a db session and pass it to the callback
+                kwargs[self.KEYWORD] = session
+                return callback(*args, **kwargs)
+        return wrapper


Plugin = DatabasePlugin
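The point of the new `wrapper` closure is that the session is opened per call, not once when the route is installed (as the old `functools.partial` version did). A toy sketch of that design choice, with a `contextlib` stand-in for the SQLAlchemy session factory:

```python
import contextlib


@contextlib.contextmanager
def get_session():
    # Hypothetical stand-in for a SQLAlchemy session factory
    yield {"open": True}


def make_route(callback, keyword="db"):
    def wrapper(*args, **kwargs):
        # A fresh session is opened on every request, inside the call
        with get_session() as session:
            kwargs[keyword] = session
            return callback(*args, **kwargs)
    return wrapper


route = make_route(lambda db: db["open"])
print(route(), route())
```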
63 flatisfy/web/js_src/api/index.js Normal file
@@ -0,0 +1,63 @@
import moment from 'moment'

require('es6-promise').polyfill()
require('isomorphic-fetch')

export const getFlats = function (callback) {
    fetch('/api/v1/flats')
        .then(function (response) {
            return response.json()
        }).then(function (json) {
            const flats = json.data
            flats.map(flat => {
                if (flat.date) {
                    flat.date = moment(flat.date)
                }
                return flat
            })
            callback(flats)
        }).catch(function (ex) {
            console.error('Unable to parse flats: ' + ex)
        })
}

export const getFlat = function (flatId, callback) {
    fetch('/api/v1/flat/' + encodeURIComponent(flatId))
        .then(function (response) {
            return response.json()
        }).then(function (json) {
            const flat = json.data
            if (flat.date) {
                flat.date = moment(flat.date)
            }
            callback(json.data)
        }).catch(function (ex) {
            console.error('Unable to parse flat: ' + ex)
        })
}

export const updateFlatStatus = function (flatId, newStatus, callback) {
    fetch(
        '/api/v1/flat/' + encodeURIComponent(flatId) + '/status',
        {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json'
            },
            body: JSON.stringify({
                status: newStatus
            })
        }
    ).then(callback)
}

export const getTimeToPlaces = function (callback) {
    fetch('/api/v1/time_to/places')
        .then(function (response) {
            return response.json()
        }).then(function (json) {
            callback(json.data)
        }).catch(function (ex) {
            console.error('Unable to fetch time to places: ' + ex)
        })
}
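All of these endpoints wrap their payload in a `{"data": ...}` envelope, and `getFlats` parses ISO date strings into date objects. A hedged Python sketch of consuming that envelope (the response shape is inferred from the JS above, and the sample payload is made up):

```python
import datetime
import json


def parse_flats(payload):
    """Unwrap the {"data": [...]} envelope and parse ISO dates, mirroring getFlats()."""
    flats = json.loads(payload)["data"]
    for flat in flats:
        if flat.get("date"):
            # Keep only the date-time part, drop any timezone suffix
            flat["date"] = datetime.datetime.strptime(
                flat["date"][:19], "%Y-%m-%dT%H:%M:%S"
            )
    return flats


flats = parse_flats('{"data": [{"id": "x", "date": "2017-04-13T12:00:00"}]}')
print(flats[0]["id"], flats[0]["date"].year)
```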
76 flatisfy/web/js_src/components/app.vue Normal file
@@ -0,0 +1,76 @@
<template>
    <div>
        <h1><router-link :to="{name: 'home'}">Flatisfy</router-link></h1>
        <nav>
            <ul>
                <li><router-link :to="{name: 'home'}">{{ $t("menu.available_flats") }}</router-link></li>
                <li><router-link :to="{name: 'followed'}">{{ $t("menu.followed_flats") }}</router-link></li>
                <li><router-link :to="{name: 'ignored'}">{{ $t("menu.ignored_flats") }}</router-link></li>
                <li><router-link :to="{name: 'user_deleted'}">{{ $t("menu.user_deleted_flats") }}</router-link></li>
            </ul>
        </nav>
        <router-view></router-view>
    </div>
</template>

<style>
body {
    margin: 0 auto;
    max-width: 75em;
    font-family: "Helvetica", "Arial", sans-serif;
    line-height: 1.5;
    padding: 4em 1em;
    padding-top: 1em;
    color: #555;
}

h1 {
    text-align: center;
}

h1,
h2,
strong,
th {
    color: #333;
}

table {
    border-collapse: collapse;
    margin: 1em;
    width: calc(100% - 2em);
    text-align: center;
}

th, td {
    padding: 1em;
    border: 1px solid black;
}

tbody>tr:hover {
    background-color: #DDD;
}
</style>

<style scoped>
h1 a {
    color: inherit;
    text-decoration: none;
}

nav {
    text-align: center;
}

nav ul {
    list-style-position: inside;
    padding: 0;
}

nav ul li {
    list-style: none;
    display: inline-block;
    padding-left: 1em;
    padding-right: 1em;
}
</style>
103 flatisfy/web/js_src/components/flatsmap.vue Normal file
@@ -0,0 +1,103 @@
<template lang="html">
    <div class="full">
        <v-map :zoom="zoom.defaultZoom" :center="center" :bounds="bounds" :min-zoom="zoom.minZoom" :max-zoom="zoom.maxZoom">
            <v-tilelayer :url="tiles.url" :attribution="tiles.attribution"></v-tilelayer>
            <template v-for="marker in flats">
                <v-marker :lat-lng="{ lat: marker.gps[0], lng: marker.gps[1] }" :icon="icons.flat">
                    <v-popup :content="marker.content"></v-popup>
                </v-marker>
            </template>
            <template v-for="(place_gps, place_name) in places">
                <v-marker :lat-lng="{ lat: place_gps[0], lng: place_gps[1] }" :icon="icons.place">
                    <v-tooltip :content="place_name"></v-tooltip>
                </v-marker>
            </template>
        </v-map>
    </div>
</template>

<script>