access time limit

netluca1 (5 years ago)

Hello everyone, I wanted to ask the community whether anyone has found a solution that keeps the tracking service running while respecting the privacy authority's rules.
The legislation requires that the employee not be monitored outside of working hours.
At the same time, for the safety of the vehicle, it is good that the data are collected constantly.
So I was thinking of a solution along these lines:
maybe we can set working hours for each user, so that access can be obscured or blocked during non-working hours.
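As a rough sketch of that idea: the device keeps reporting, but a visibility gate checks each position's timestamp against the user's configured window before showing it. The working-hours window and the override flag here are assumptions for illustration, not an existing Traccar setting.

```python
from datetime import datetime, time

# Hypothetical per-user working-hours window (an assumption for this
# sketch; not a built-in Traccar configuration).
WORK_START = time(8, 0)
WORK_END = time(18, 0)

def position_visible(fix_time: datetime, override: bool = False) -> bool:
    """Return True if a position recorded at fix_time may be shown.

    Data is always *collected*; this gate only hides positions whose
    timestamp falls outside the configured working hours, unless an
    emergency override is active.
    """
    if override:
        return True
    return WORK_START <= fix_time.time() <= WORK_END
```

The same check would have to be applied to live views and to historical reports alike, since both expose the employee's position.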

Tony Shelver (5 years ago)

The problem with that is that the user can query historical tracking data. You would have to block any access to after-hours data, except perhaps on an exception / override / emergency basis.

I would read the legislation carefully; it seems overly restrictive to me. After all, the employee is using a company resource, not his own transport.
We have built an entire system, fed from Traccar (in testing) and our existing tracking system, to provide precisely that type of information to managers, among other things.
For example, they can detect after-hours abuse of vehicles and / or cost out after-hours use. They also use this data to confirm employee time and overtime claims on job sites.

netluca1 (5 years ago)

Hi @tshelver, I think we are talking about the same solution you suggest.
The tracker continues to record data, but the user's queries are blocked outside of working hours.
Since it sounds like you have implemented it, we could compare notes; we would be interested in this function.
My email: [REMOVED]

Tony Shelver (5 years ago)

Our project took a fair bit of effort to develop. We initially attempted it using mostly SQL Server and MS tools, but ran into cost, performance and flexibility issues. As a result of several lessons learned, we did a redesign based on Ubuntu, Postgresql 11, and Python 3.

We are pulling data from a legacy tracking system (Oracle DB) into our custom Postgresql database. I am currently finalizing the Traccar integration after playing around with it for a while. It should be in beta by end January.

Among other lessons, we found that SQL Server's GIS functions suck in terms of performance and coverage.
We also learned never to trust a proprietary vendor: our current system uses way-out-of-date versions of both Java and Oracle, and despite an upgrade last year the vendor has still not provided a replacement for their reporting module, which uses Adobe Flash.
We also looked at a few tracking systems in a fair degree of depth for input, after experiencing some severe constraints with our existing system. For example, we found that for trip (route) and stop reports, one system used only the vehicle's flame in / flame out signal to mark trip start / stop, while another seems to use only device stationary time.
We found the most accurate way was to combine both signals in the logic.
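That combined logic might look roughly like this (a sketch only; the field names `ignition`, `moving` and `time` are assumptions, not any vendor's actual schema):

```python
from datetime import datetime, timedelta

# Assumed stationary time after which a trip is considered ended.
STOP_THRESHOLD = timedelta(minutes=5)

def trip_boundaries(positions):
    """Yield ('start' | 'stop', timestamp) events from a time-ordered
    list of position dicts with 'time' (datetime), 'ignition' (bool)
    and 'moving' (bool).

    A trip starts when ignition comes on, and stops when ignition goes
    off OR the vehicle has been stationary longer than STOP_THRESHOLD,
    combining both signals rather than relying on either alone.
    """
    in_trip = False
    stationary_since = None
    for p in positions:
        if not in_trip:
            if p["ignition"]:
                in_trip = True
                stationary_since = None
                yield ("start", p["time"])
        else:
            if p["moving"]:
                stationary_since = None
            elif stationary_since is None:
                stationary_since = p["time"]
            timed_out = (stationary_since is not None
                         and p["time"] - stationary_since >= STOP_THRESHOLD)
            if not p["ignition"] or timed_out:
                in_trip = False
                yield ("stop", p["time"])
```

Relying on ignition alone misses trips that end with the engine idling; relying on stationary time alone splits trips at long traffic stops, which is why combining the two works better.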

Our setup is quite specific to our requirements, although the basic design is very flexible.
We use Postgresql with the PostGIS extension. All incoming lat/long positional data is converted to PostGIS geography points. We make extensive use of PostGIS for all area / distance / position / cached address functionality.
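The lat/long-to-geography conversion is a one-expression cast in PostGIS; a hedged sketch of the insert statement as it might be issued from Python (the table and column names are invented for illustration):

```python
# SQL used to store an incoming fix as a PostGIS geography point.
# ST_MakePoint takes (longitude, latitude) -- in that order -- and
# SRID 4326 is WGS84, the datum GPS devices report in.
# Table and column names below are made up for this sketch.
INSERT_FIX = """
INSERT INTO position_fix (device_id, fix_time, geog)
VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography)
"""

def fix_params(device_id, fix_time, lat, lon):
    """Order the bind parameters, swapping to the (lon, lat) order
    that ST_MakePoint expects."""
    return (device_id, fix_time, lon, lat)
```

Getting the (lon, lat) argument order wrong is a classic PostGIS mistake, which is why it is worth centralizing in one helper.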
Our DB structure can store all history (such as driver / vehicle, vehicle / device changes and a lot more), and is flexible enough for us to define custom relationships on the fly.
All our reports (for example route, stop, event, area, log book, and vehicle mileage / time billing reports) are extracted through Postgres functions, and then formatted and distributed in Python.
We found this saved a lot of time: we have one Postgres function that extracts data for routes, stops, log books, area reports and billing reports, where all the heavy lifting and most of the complex logic is done. This function is then called by several Python modules that generate the final reports.
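The shape of that pattern, reduced to pure Python for illustration (in the real system the shared `extract` step is a Postgres function, and these field names are invented):

```python
def extract(rows, period_start, period_end):
    """The one shared 'heavy lifting' step: filter to the requested
    period and sort by time.  Every report formatter below reuses it."""
    return sorted(
        (r for r in rows if period_start <= r["time"] <= period_end),
        key=lambda r: r["time"],
    )

def mileage_report(rows, period_start, period_end):
    """Thin consumer: total distance over the period."""
    return sum(r["km"] for r in extract(rows, period_start, period_end))

def stop_report(rows, period_start, period_end):
    """Thin consumer: stops within the period."""
    return [(r["time"], r["km"])
            for r in extract(rows, period_start, period_end)
            if r.get("stopped")]
```

The payoff is that a fix to the shared extraction logic immediately benefits every report built on top of it.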

Currently we distribute data as fully formatted Excel spreadsheets (generated and formatted with the Python openpyxl module), as most users seem to find it easier to see what's going on with multiple vehicles over the selected period in Excel than in the on-screen report.

Telegram
For now, report requests and vehicle stats / position queries are made via a Telegram bot, although the backend can be reused for an upcoming browser / smartphone app.

Other
One of the reasons for the extensive use of Postgres functions (apart from performance) is that, going forward, we will be using a GraphQL API, and this is a doddle to generate with Postgraphile, which automatically builds a GraphQL schema / API / server from Postgres. This will allow reuse of most of the Postgres functions we have already created.

Once Traccar is fully integrated, the next move is to move the entire system to the cloud in addition to or in place of our own servers, to allow customers and vendors their own system and UI, and to focus more on custom development for specific customer requirements.