first commit
This commit is contained in:
commit 9e6d0a6911
7
.cursorignore
Normal file
@@ -0,0 +1,7 @@
venv/
__pycache__/
*.pyc
*.pyo
*.pyd
*.pyw
*.pyz
5
.gitignore
vendored
Normal file
@@ -0,0 +1,5 @@
venv/
__pycache__/
*.pyc
*.log
*.env
216
README.md
Normal file
@@ -0,0 +1,216 @@
# Employee Workstation Activity Tracking System

A comprehensive system for tracking user activity on Windows workstations and reporting it to a central server. The system logs user logon events and monitors active/inactive periods, providing detailed reports through a web dashboard.

## Overview

This system consists of two main components:

1. **Client Agent**: A PowerShell script that runs on Windows workstations to detect user idle time and report activity state changes.
2. **Server Application**: A Flask-based web application that receives activity reports, stores them in a database, and provides reporting capabilities.

The system tracks when users are "working" or "stopped" based on a 5-minute inactivity threshold, allowing organizations to measure effective working time.
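The duration model can be illustrated with a short sketch: each "working" event is paired with the next event for the same user, and the gap between them counts as working time. This mirrors the pairing logic used by the server's reporting queries; the sketch below is illustrative and not part of the repository.

```python
from datetime import datetime

def total_working_hours(events):
    """events: iterable of (user, state, ts) tuples; returns hours per user."""
    totals = {}
    last = {}  # user -> (state, ts) of that user's previous event
    for user, state, ts in sorted(events, key=lambda e: (e[0], e[2])):
        prev = last.get(user)
        # Only intervals that *start* with a 'working' event count
        if prev and prev[0] == 'working':
            hours = (ts - prev[1]).total_seconds() / 3600.0
            totals[user] = totals.get(user, 0.0) + hours
        last[user] = (state, ts)
    return totals

events = [
    ('alice', 'working', datetime(2023, 7, 8, 9, 0)),
    ('alice', 'stopped', datetime(2023, 7, 8, 10, 30)),
    ('alice', 'working', datetime(2023, 7, 8, 11, 0)),
    ('alice', 'stopped', datetime(2023, 7, 8, 12, 0)),
]
print(total_working_hours(events))  # {'alice': 2.5}
```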
## Client-Side Setup

### Prerequisites

* Windows 10/11 workstations
* PowerShell 3.0 or higher
* Administrative access (for setting up scheduled tasks)

### Installation

1. Copy the `report.ps1` script to a secure location on the workstation (e.g., `C:\Scripts\`)

2. Copy `config.env` to the same location as the PowerShell script, or set the following environment variables:
   - `API_ENDPOINT`: URL of the server's reporting endpoint
   - `IDLE_THRESHOLD_MINUTES`: Inactivity threshold in minutes (default: 5)
   - `POLL_INTERVAL_SECONDS`: How often to check for idle state (default: 60)

3. Run the `schedule_task.ps1` script with administrative privileges to create the scheduled task:

   ```powershell
   .\schedule_task.ps1 -ScriptPath "C:\Scripts\report.ps1"
   ```

The script will automatically run when users log on to the workstation.
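The agent's core behavior is a simple state machine: sample idle time on each poll and emit an event only when the working/stopped state flips across the threshold. A language-neutral sketch of that logic (illustrative Python; the actual client is `report.ps1`, and `idle_samples` stands in for the platform idle-time query):

```python
def next_state(idle_minutes, threshold=5):
    """Classify a single idle-time sample against the inactivity threshold."""
    return 'stopped' if idle_minutes >= threshold else 'working'

def detect_transitions(idle_samples, threshold=5):
    """Yield a state-change event only when the classification flips."""
    state = None
    for idle in idle_samples:
        new = next_state(idle, threshold)
        if new != state:
            state = new
            yield new  # the real agent would POST this to API_ENDPOINT

print(list(detect_transitions([0, 1, 6, 7, 0])))  # ['working', 'stopped', 'working']
```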
## Server-Side Setup

### Prerequisites

* Python 3.9 or higher
* SQLite (development) or PostgreSQL (production)
* pip (Python package manager)

### Installation

1. Clone this repository to your server

2. Create and activate a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

4. Create a configuration file:

   ```bash
   cp config.env.example config.env
   ```

5. Edit `config.env` with your specific settings:
   - `DATABASE_URI`: Connection string for your database
   - `API_ENDPOINT`: URL for the reporting endpoint
   - Other configuration parameters

6. Set up the database:

   ```bash
   # For SQLite
   sqlite3 work_events.db < create_db.sql

   # For PostgreSQL
   psql -U username -d database_name -f create_db.sql
   ```

7. Run the application:

   ```bash
   # Development
   python run.py

   # Production (with Gunicorn)
   gunicorn -w 4 -b 0.0.0.0:5000 "app:create_app()"
   ```
## System Architecture

### Client Agent

The PowerShell script:
- Is launched at user logon via Task Scheduler
- Monitors user idle time using the `quser` command or Win32 API calls
- Reports state changes to the server via HTTP POST requests
- Uses the built-in `Invoke-RestMethod` cmdlet for API communication
- Implements error handling and local logging
- Supports configuration via environment variables or a `config.env` file

### Server Application

The Flask application:
- Uses a modular structure with separate components for models, views, and API endpoints
- Exposes a RESTful API endpoint at `/api/report`
- Stores activity events in a relational database
- Provides reporting functionality via SQL aggregation
- Includes a web dashboard for visualizing activity data
- Supports SQLite for development and PostgreSQL for production

#### Application Structure

```
user_work_tracking/
├── app/                  # Application package
│   ├── api/              # API endpoints
│   │   ├── events.py     # Event reporting endpoints
│   │   └── reports.py    # Data reporting endpoints
│   ├── utils/            # Utility functions
│   │   ├── formatting.py # Data formatting functions
│   │   └── queries.py    # SQL query functions
│   ├── views/            # Web views
│   │   └── dashboard.py  # Dashboard views
│   ├── models.py         # Database models
│   └── errors.py         # Error handlers
├── instance/             # Instance-specific data
├── static/               # Static files
│   ├── css/              # CSS stylesheets
│   └── js/               # JavaScript files
├── templates/            # HTML templates
├── create_db.sql         # Database schema creation script
├── config.env            # Configuration file
├── report.ps1            # Client-side PowerShell script
├── schedule_task.ps1     # Task scheduler setup script
├── requirements.txt      # Python dependencies
└── run.py                # Application entry point
```
## API Reference

### Report Activity Endpoint

**URL**: `/api/report`
**Method**: `POST`
**Auth**: None (LAN-restricted)

**Request Body**:
```json
{
  "user": "username",
  "state": "working",
  "ts": "2023-07-08T12:30:45Z"
}
```

Fields:
- `user`: Windows username
- `state`: Either "working" or "stopped"
- `ts`: ISO 8601 timestamp (optional, defaults to server time)

**Success Response**:
```json
{
  "success": true
}
```
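Any HTTP client can call the endpoint; here is a minimal sketch using only the Python standard library (the server address is an example — substitute your own):

```python
import json
import urllib.request

def build_report_request(endpoint, user, state, ts=None):
    """Build a POST request carrying the JSON payload expected by /api/report."""
    payload = {"user": user, "state": state}
    if ts:
        payload["ts"] = ts
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_report_request("http://localhost:5000/api/report",
                           "jsmith", "working", "2023-07-08T12:30:45Z")
# Send it once the server is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))  # {"success": true} on success
print(req.get_method(), req.full_url)
```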
## Reporting

Access the dashboard by navigating to `http://your-server-address:5000/` in a web browser.

The system provides:
- Daily, weekly, and monthly summaries of working time
- First login time tracking
- User activity breakdowns with detailed logs
- Interactive date navigation
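The reporting endpoints behind the dashboard return JSON rows keyed by user and period. The sample payload below is made up, but the field names (`user`, `day`, `duration_hours`, `first_login_time`) follow the daily report endpoint in `app.py`:

```python
import json

# Sample response shaped like /api/reports/daily output; values are illustrative.
sample = json.loads("""
{"success": true,
 "data": [{"user": "jsmith", "day": "2023-07-08T00:00:00",
           "duration_hours": 6.5, "first_login_time": "2023-07-08T08:55:00"}]}
""")

for row in sample["data"]:
    print(f"{row['user']}: {row['duration_hours']:.1f} h, first login {row['first_login_time']}")
```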
## Troubleshooting

### Client Issues

- Check the log file at `%USERPROFILE%\AppData\Local\Temp\user_work_tracking_client.log`
- Verify the scheduled task is configured correctly in Task Scheduler
- Test the script manually: `.\report.ps1`
- Make sure the `config.env` file or environment variables are correctly set

### Server Issues

- Check the application logs in the `instance` directory
- Verify database connectivity
- Ensure the server is accessible from client workstations
- When using PostgreSQL, ensure column names are properly quoted (especially "user", which is a reserved keyword)
- If users appear incorrectly in the dashboard, check the SQL queries to ensure proper schema specification (e.g., "public.work_events")

### PostgreSQL-Specific Issues

- **Reserved Keywords**: Be careful with reserved keywords like "user". Always quote them with double quotes in SQL queries.
- **Schema Specification**: Explicitly reference the schema (e.g., "public.work_events" instead of just "work_events") to avoid potential naming conflicts.
- **Connection String**: Ensure the database URI in `config.env` uses the correct username, password, and database name.
- **Database vs. Login Role**: The PostgreSQL username used to connect to the database is separate from the usernames stored in the "user" column of the work_events table.
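The reserved-keyword pitfall is easy to hit because an unquoted `user` in PostgreSQL is valid SQL — it evaluates to the current login role instead of the column. A small sketch of the wrong and right forms (plain strings, as they would be passed to SQLAlchemy's `text()` in `app.py`):

```python
# WRONG: unquoted "user" resolves to the login role, returning the same
# value for every row instead of the stored username.
bad_query = "SELECT user, state FROM work_events"

# RIGHT: double-quote the column and qualify the schema explicitly.
good_query = 'SELECT "user", state FROM public.work_events'

# Parameterized filter, as used with text() in app.py:
filter_query = 'SELECT "user", state FROM public.work_events WHERE "user" = :u'
print(good_query)
```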
## Security Considerations

- The system is designed for internal LAN use only and lacks authentication
- Ensure the server is not exposed to the public internet
- Consider implementing network-level access controls

## License

This project is licensed under the MIT License - see the LICENSE file for details.
835
app.py
Normal file
@@ -0,0 +1,835 @@
"""
Employee Workstation Activity Tracking - Flask API Server

This Flask application provides a REST API for receiving and storing user activity
events from client workstations. It exposes endpoints for reporting activity state
changes and retrieving aggregated reports.
"""
import os
import sys
import logging
from logging.handlers import RotatingFileHandler
from datetime import datetime, timedelta

from dotenv import load_dotenv
from flask import Flask, request, jsonify, render_template
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy import text, func, case, cast, Integer

# Load environment variables: try .env first, then config.env
load_dotenv()
config_env_path = os.path.join(os.path.dirname(__file__), 'config.env')
if os.path.exists(config_env_path):
    load_dotenv(config_env_path)
    print(f"Loaded environment variables from {config_env_path}")
else:
    print(f"Warning: config.env file not found at {config_env_path}")
# Print all DATABASE* environment variables for debugging (credentials masked)
for key, value in os.environ.items():
    if key.startswith("DATABASE"):
        masked_value = value
        if "@" in value:
            parts = value.split("@")
            masked_value = "****@" + parts[1]
        print(f"{key}: {masked_value}")

# Initialize Flask app
app = Flask(__name__, instance_relative_config=True)

# Load configuration
# In production, use environment variables or a .env file
app.config.from_mapping(
    SECRET_KEY=os.environ.get('SECRET_KEY', 'dev'),
    SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', os.environ.get('DATABASE_URL')),
    SQLALCHEMY_TRACK_MODIFICATIONS=False
)

# Ensure a database URI is set
if not app.config['SQLALCHEMY_DATABASE_URI']:
    raise ValueError("DATABASE_URI or DATABASE_URL environment variable must be set")

# Ensure the instance folder exists
try:
    os.makedirs(app.instance_path)
except OSError:
    pass
# Configure logging
log_formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
log_handler = RotatingFileHandler(
    os.path.join(app.instance_path, 'server.log'),
    maxBytes=1024 * 1024 * 5,  # 5 MB
    backupCount=5
)
log_handler.setFormatter(log_formatter)
log_handler.setLevel(logging.INFO)

if not app.debug:  # Avoid duplicate logs in debug mode if the Werkzeug logger is also active
    app.logger.addHandler(log_handler)
    app.logger.setLevel(logging.INFO)

app.logger.info('Flask application starting up...')

# Initialize database
db = SQLAlchemy()
db.init_app(app)
# Define database models
class WorkEvent(db.Model):
    """
    Represents a user activity event with state transitions (working/stopped).
    """
    __tablename__ = 'work_events'

    id = db.Column(db.Integer, primary_key=True)
    user = db.Column(db.String(100), nullable=False, index=True)
    state = db.Column(db.String(10), nullable=False)  # 'working' or 'stopped'
    ts = db.Column(db.DateTime, nullable=False,
                   server_default=db.func.current_timestamp(),
                   index=True)

    def __repr__(self):
        return f"<WorkEvent(user='{self.user}', state='{self.state}', ts='{self.ts}')>"

    def to_dict(self):
        """Convert model to dictionary for API responses"""
        return {
            'id': self.id,
            'user': self.user,
            'state': self.state,
            'ts': self.ts.isoformat() if self.ts else None
        }


# Log model metadata for debugging
app.logger.info(f"WorkEvent __tablename__: {WorkEvent.__tablename__}")
app.logger.info(f"WorkEvent columns: {[column.name for column in WorkEvent.__table__.columns]}")
try:
    # Check if there are any attribute mappers or event listeners
    app.logger.info(f"WorkEvent attribute names: {dir(WorkEvent)}")
except Exception as e:
    app.logger.error(f"Error inspecting WorkEvent class: {str(e)}")
# API Routes
@app.route('/api/report', methods=['POST'])
def report_event():
    """
    Endpoint for clients to report activity state changes.

    Expected JSON payload:
    {
        "user": "username",
        "state": "working|stopped",
        "ts": "2023-07-08T12:30:45Z"  (optional, ISO 8601)
    }
    """
    data = request.get_json()
    app.logger.info(f"Received report request: {data}")

    if not data or 'user' not in data or 'state' not in data:
        app.logger.warning("Invalid report request payload.")
        return jsonify({
            'success': False,
            'message': 'Missing required fields: user, state'
        }), 400

    # Validate state value
    if data['state'] not in ['working', 'stopped']:
        return jsonify({
            'success': False,
            'message': 'Invalid state value. Must be "working" or "stopped"'
        }), 400

    user = data['user']
    state = data['state']
    ts_str = data.get('ts')  # Optional timestamp

    # Parse the timestamp if provided, otherwise use the current UTC time
    event_ts = datetime.utcnow()
    if ts_str:
        try:
            event_ts = datetime.fromisoformat(ts_str.replace('Z', '+00:00'))
        except ValueError:
            return jsonify({
                'success': False,
                'message': 'Invalid timestamp format. Use ISO 8601 (YYYY-MM-DDTHH:MM:SSZ)'
            }), 400

    # Create and store the event
    new_event = WorkEvent(user=user, state=state, ts=event_ts)

    try:
        app.logger.info(f"Attempting to add event to database: User={user}, State={state}, TS={event_ts}")
        db.session.add(new_event)
        db.session.commit()
        app.logger.info(f"Successfully recorded event: User={user}, State={state}")
        return jsonify({"success": True}), 201
    except SQLAlchemyError as e:
        db.session.rollback()
        app.logger.error(f"Database error while recording event: {e}")
        return jsonify({"success": False, "message": "Database error"}), 500
    except Exception as e:
        app.logger.error(f"Unexpected error processing report request: {e}")
        return jsonify({"success": False, "message": "Internal server error"}), 500
# --- Helper Functions for Duration Calculation ---

def calculate_duration_sql(time_period):
    """
    Generates the core SQL query to calculate working durations.
    Uses the LEAD() window function to pair each 'working' event with the next event.
    Calculates duration in hours using PostgreSQL functions.
    """
    # PostgreSQL date-truncation expression for the requested grouping
    period_grouping = {
        'daily': "DATE_TRUNC('day', start_time)",
        'weekly': "DATE_TRUNC('week', start_time)",   # PostgreSQL weeks start on Monday
        'monthly': "DATE_TRUNC('month', start_time)"
    }.get(time_period, "DATE_TRUNC('day', start_time)")  # Default to daily if invalid

    # Duration in hours: EXTRACT(EPOCH FROM interval) / 3600
    duration_calculation = "EXTRACT(EPOCH FROM (next_event_time - start_time)) / 3600.0"

    # Use the public schema explicitly; aggregate by user and period
    sql_query = f"""
    WITH EventPairs AS (
        SELECT
            "user",
            ts AS start_time,
            state,
            LEAD(ts) OVER (PARTITION BY "user" ORDER BY ts) AS next_event_time,
            LEAD(state) OVER (PARTITION BY "user" ORDER BY ts) AS next_event_state
        FROM public.work_events
    ),
    CalculatedDurations AS (
        SELECT
            "user",
            {period_grouping} AS period_start,
            SUM(
                CASE
                    WHEN state = 'working' AND next_event_time IS NOT NULL THEN
                        {duration_calculation}
                    ELSE 0  -- Ignore intervals starting with 'stopped' or without a following event
                END
            ) AS total_hours,
            MIN(CASE WHEN state = 'working' THEN start_time END) AS first_login_time
        FROM EventPairs
        WHERE state = 'working'  -- Only consider intervals that start with 'working'
        GROUP BY "user", period_start
    )
    -- Final aggregation to ensure one row per user per period
    SELECT
        "user",
        period_start,
        SUM(total_hours) AS total_hours,
        MIN(first_login_time) AS first_login_time
    FROM CalculatedDurations
    GROUP BY "user", period_start
    ORDER BY "user", period_start DESC;
    """

    # Debug logging to inspect the generated SQL
    app.logger.info(f"Generated SQL query: {sql_query}")

    return sql_query
def filter_sql_by_user(base_sql, user):
    """Applies a user filter to the SQL query safely."""
    # Find the position of GROUP BY so the WHERE clause can be replaced correctly
    group_by_pos = base_sql.find("GROUP BY")
    if group_by_pos != -1:
        # Make the user filter case-insensitive using LOWER()
        where_clause = "WHERE state = 'working' AND LOWER(\"user\") = LOWER(:user)\n        "
        # Replace the generic WHERE clause with the user-specific one
        filtered_sql = base_sql[:base_sql.find("WHERE")] + where_clause + base_sql[group_by_pos:]
        return filtered_sql
    else:
        # Should not happen with the current query structure, but handle defensively
        return base_sql  # Return original if GROUP BY not found
def fetch_duration_report(time_period, user_filter=None):
    """Fetches duration report data from the database."""
    app.logger.debug(f"Fetching duration report. Period: {time_period}, User: {user_filter}")
    sql_query = calculate_duration_sql(time_period)
    params = {}
    if user_filter:
        # Note: filter_sql_by_user modifies the query string directly
        sql_query = filter_sql_by_user(sql_query, user_filter)
        params['user'] = user_filter
        app.logger.debug(f"Applying user filter: {user_filter}")

    # Log the database connection info with credentials masked
    db_uri = app.config['SQLALCHEMY_DATABASE_URI']
    masked_uri = db_uri
    if '@' in db_uri:
        parts = db_uri.split('@')
        masked_uri = "****@" + parts[1]
    app.logger.info(f"Executing query using database: {masked_uri}")

    try:
        # Sanity check: count the records currently in the table
        count_query = "SELECT COUNT(*) FROM work_events"
        count_result = db.session.execute(text(count_query)).scalar()
        app.logger.info(f"Total records in work_events table: {count_result}")

        # Sanity check: list the distinct users
        users_query = "SELECT DISTINCT \"user\" FROM work_events"
        users_result = db.session.execute(text(users_query)).fetchall()
        user_list = [row[0] for row in users_result]
        app.logger.info(f"Distinct users in work_events table: {user_list}")

        results = db.session.execute(text(sql_query), params).mappings().all()
        app.logger.debug(f"Database query executed. Found {len(results)} rows.")
        return results
    except Exception as e:
        app.logger.error(f"Error executing duration report query: {e}")
        # Re-raise the exception to be handled by the endpoint's error handler
        raise
def fetch_user_activity(username, start_date, end_date):
    """Fetches detailed user activity logs for a specific date range."""
    app.logger.debug(f"Fetching activity logs for user: {username}, from: {start_date}, to: {end_date}")

    # SQL query to match working/stopped pairs and calculate session durations
    sql_query = """
    WITH EventPairs AS (
        SELECT
            w1."user",
            DATE(w1.ts) AS work_date,
            w1.ts AS start_time,
            w2.ts AS end_time,
            EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600 AS session_duration_hours
        FROM
            work_events w1
        JOIN
            work_events w2 ON w1."user" = w2."user"
            AND w1.state = 'working'
            AND w2.state = 'stopped'
            AND w2.ts > w1.ts
            AND NOT EXISTS (
                SELECT 1 FROM work_events w3
                WHERE w3."user" = w1."user"
                AND w3.ts > w1.ts AND w3.ts < w2.ts
            )
        WHERE
            w1."user" = :username
            AND DATE(w1.ts) BETWEEN :start_date AND :end_date
        ORDER BY
            w1.ts
    )
    SELECT * FROM EventPairs
    """

    try:
        params = {
            'username': username,
            'start_date': start_date,
            'end_date': end_date
        }
        results = db.session.execute(text(sql_query), params).mappings().all()
        app.logger.debug(f"User activity query executed. Found {len(results)} rows.")
        return results
    except Exception as e:
        app.logger.error(f"Error executing user activity query: {e}")
        raise
def format_report_data(results, time_period):
    """Formats the raw database results into a list of dictionaries for the API."""
    app.logger.debug(f"Formatting report data for period: {time_period}. Input rows: {len(results)}")
    period_key_map = {
        'daily': 'day',
        'weekly': 'week_start',
        'monthly': 'month_start'
    }
    period_key = period_key_map.get(time_period, 'period_start')  # Default if unexpected period

    # First convert raw rows to dictionaries
    raw_data = []
    for row in results:
        # Ensure period_start is converted to string if it's a date/datetime object
        period_value = row['period_start']
        if hasattr(period_value, 'isoformat'):
            period_value = period_value.isoformat()

        # Format first_login_time
        first_login_time = row['first_login_time']
        if hasattr(first_login_time, 'isoformat'):
            first_login_time = first_login_time.isoformat()

        # Ensure duration_hours is a float, not a string or Decimal
        duration_hours = row['total_hours']
        if duration_hours is None:
            duration_hours = 0.0
        else:
            # Convert to float explicitly so it is JSON-serializable as a number
            duration_hours = float(duration_hours)

        raw_data.append({
            'user': row['user'],
            period_key: period_value,
            'duration_hours': duration_hours,
            'first_login_time': first_login_time
        })

    # Consolidate any duplicate user+period entries
    user_period_map = {}
    for entry in raw_data:
        user = entry['user']
        period = entry[period_key]
        key = f"{user}_{period}"

        if key in user_period_map:
            # Aggregate duration for the existing user+period
            user_period_map[key]['duration_hours'] += entry['duration_hours']

            # Use the earliest first_login_time
            existing_time = user_period_map[key]['first_login_time']
            new_time = entry['first_login_time']

            if existing_time and new_time:
                if new_time < existing_time:
                    user_period_map[key]['first_login_time'] = new_time
        else:
            # New user+period combination
            user_period_map[key] = entry

    # Convert the consolidated map back to a list
    formatted_data = list(user_period_map.values())

    app.logger.debug(f"Formatted report data created. Output rows: {len(formatted_data)}")
    return formatted_data
def format_user_activity(results):
    """Formats the raw user activity results into a list of dictionaries."""
    formatted_data = []
    for row in results:
        start_time = row['start_time']
        end_time = row['end_time']

        # Format timestamps for display
        if hasattr(start_time, 'isoformat'):
            start_time = start_time.isoformat()
        if hasattr(end_time, 'isoformat'):
            end_time = end_time.isoformat()

        # Format duration as a float
        duration = float(row['session_duration_hours']) if row['session_duration_hours'] is not None else 0.0

        formatted_data.append({
            'date': row['work_date'].isoformat() if hasattr(row['work_date'], 'isoformat') else str(row['work_date']),
            'start_time': start_time,
            'end_time': end_time,
            'duration_hours': round(duration, 2)
        })
    return formatted_data
# --- Reporting Endpoints ---

@app.route('/api/reports/daily', methods=['GET'])
def get_daily_report():
    app.logger.info("Daily report API requested.")
    try:
        app.logger.info("Fetching daily report data...")

        # Get the date parameter, or use today as the default
        selected_date = request.args.get('date')
        if selected_date:
            app.logger.info(f"Using selected date: {selected_date}")
        else:
            selected_date = datetime.now().strftime('%Y-%m-%d')
            app.logger.info(f"No date provided, using today: {selected_date}")

        user_filter = request.args.get('user')

        # Get the regular daily report results
        results = fetch_duration_report('daily', user_filter)

        # Keep only the entries for the selected date
        filtered_results = []
        for row in results:
            row_date = row['period_start']
            if hasattr(row_date, 'isoformat'):
                row_date = row_date.isoformat()

            # Check whether the row's date matches the selected date
            if row_date and row_date.startswith(selected_date):
                filtered_results.append(row)

        # Debug logging for usernames in the raw results
        app.logger.info(f"Raw results usernames for date {selected_date}: {[r['user'] for r in filtered_results]}")

        report = format_report_data(filtered_results, 'daily')

        # Debug logging for usernames in the formatted data
        app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")

        app.logger.info(f"Successfully generated daily report for date: {selected_date}, user: {request.args.get('user', 'All')}. Found {len(filtered_results)} records.")
        return jsonify({"success": True, "data": report})
    except Exception as e:
        app.logger.error(f"Error generating daily report: {e}")
        return jsonify({"success": False, "message": "Error generating report"}), 500
@app.route('/api/reports/weekly', methods=['GET'])
def get_weekly_report():
    app.logger.info("Weekly report API requested.")
    try:
        app.logger.info("Fetching weekly report data...")

        # Check whether a specific day within the week was requested
        day_filter = request.args.get('day')
        user_filter = request.args.get('user')

        if day_filter:
            app.logger.info(f"Filtering weekly report for specific day: {day_filter}")
            # Use the daily query with the specific date
            results = fetch_duration_report('daily', user_filter)
            # Keep only the rows for the requested day
            filtered_results = []
            for row in results:
                row_date = row['period_start']
                if hasattr(row_date, 'isoformat'):
                    row_date = row_date.isoformat()

                # Check whether the row's date matches the requested day
                if row_date and row_date.startswith(day_filter):
                    filtered_results.append(row)

            results = filtered_results
        else:
            # Get the current week's dates for filtering
            now = datetime.now()
            # Monday of the current week
            current_week_start = now - timedelta(days=now.weekday())
            current_week_start = current_week_start.replace(hour=0, minute=0, second=0, microsecond=0)

            # Regular weekly report (whole week)
            results = fetch_duration_report('weekly', user_filter)

            # Keep only the current week and aggregate by user
            filtered_results = []
            user_aggregated = {}

            for row in results:
                row_date = row['period_start']
                # Convert to datetime if it's a string
                if isinstance(row_date, str):
                    try:
                        row_date = datetime.fromisoformat(row_date.replace('Z', '+00:00'))
                    except ValueError:
                        continue

                # Check whether the row belongs to the current week
                if row_date and row_date.date() == current_week_start.date():
                    username = row['user']

                    if username in user_aggregated:
                        # Add duration hours
                        user_aggregated[username]['total_hours'] += row['total_hours']

                        # Keep the earliest first_login_time
                        if row['first_login_time'] and user_aggregated[username]['first_login_time']:
                            row_login = row['first_login_time']
                            if isinstance(row_login, str):
                                try:
                                    row_login = datetime.fromisoformat(row_login.replace('Z', '+00:00'))
                                except ValueError:
                                    row_login = None

                            existing_login = user_aggregated[username]['first_login_time']
                            if isinstance(existing_login, str):
                                try:
                                    existing_login = datetime.fromisoformat(existing_login.replace('Z', '+00:00'))
                                except ValueError:
                                    existing_login = None

                            if row_login and existing_login and row_login < existing_login:
                                user_aggregated[username]['first_login_time'] = row['first_login_time']
                    else:
                        # First entry for this user
                        user_aggregated[username] = {
                            'user': username,
                            'period_start': row_date,
                            'total_hours': row['total_hours'],
                            'first_login_time': row['first_login_time']
                        }

            # Convert the aggregated dict back to a list
            filtered_results = list(user_aggregated.values())
            results = filtered_results if filtered_results else results

        # Debug logging for usernames in the raw results
        app.logger.info(f"Raw results usernames: {[r['user'] for r in results]}")

        report = format_report_data(results, 'weekly')

        # Debug logging for usernames in the formatted data
        app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")

        app.logger.info(f"Successfully generated weekly report for user: {request.args.get('user', 'All')}. Found {len(results)} records.")
        return jsonify({"success": True, "data": report})
    except Exception as e:
        app.logger.error(f"Error generating weekly report: {e}")
        return jsonify({"success": False, "message": "Error generating report"}), 500
@app.route('/api/reports/monthly', methods=['GET'])
|
||||
def get_monthly_report():
|
||||
app.logger.info("Monthly report API requested.")
|
||||
try:
|
||||
app.logger.info("Fetching monthly report data...")
|
||||
|
||||
# Get current month for filtering
|
||||
now = datetime.now()
|
||||
current_month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
|
||||
|
||||
# Get regular monthly report
|
||||
results = fetch_duration_report('monthly', request.args.get('user'))
|
||||
|
||||
# Filter to only include current month and aggregate by user
|
||||
filtered_results = []
|
||||
user_aggregated = {}
|
||||
|
||||
for row in results:
|
||||
row_date = row['period_start']
|
||||
# Convert to datetime if it's a string
|
||||
if isinstance(row_date, str):
|
||||
try:
|
||||
row_date = datetime.fromisoformat(row_date.replace('Z', '+00:00'))
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
# Check if it's from current month
|
||||
if row_date and row_date.year == current_month_start.year and row_date.month == current_month_start.month:
|
||||
username = row['user']
|
||||
|
||||
if username in user_aggregated:
|
||||
# Add duration hours
|
||||
user_aggregated[username]['total_hours'] += row['total_hours']
|
||||
else:
|
||||
# First entry for this user
|
||||
user_aggregated[username] = {
|
||||
'user': username,
|
||||
'period_start': row_date,
|
||||
'total_hours': row['total_hours'],
|
||||
'first_login_time': row['first_login_time']
|
||||
}
|
||||
|
||||
# Convert aggregated dict back to list
|
||||
filtered_results = list(user_aggregated.values())
|
||||
results = filtered_results if filtered_results else results
|
||||
|
||||
# Add debug logging for usernames in raw results
|
||||
app.logger.info(f"Raw results usernames: {[r['user'] for r in results]}")
|
||||
|
||||
report = format_report_data(results, 'monthly')
|
||||
|
||||
# Add debug logging for usernames in formatted data
|
||||
app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")
|
||||
|
||||
app.logger.info(f"Successfully generated monthly report for user: {request.args.get('user', 'All')}. Found {len(results)} records.")
|
||||
return jsonify({"success": True, "data": report})
|
||||
except Exception as e:
|
||||
app.logger.error(f"Error generating monthly report: {e}")
|
||||
return jsonify({"success": False, "message": "Error generating report"}), 500
|
||||
|
||||
@app.route('/api/user-activity/<username>', methods=['GET'])
|
||||
def get_user_activity(username):
|
||||
"""Gets detailed activity logs for a specific user."""
|
||||
app.logger.info(f"User activity logs requested for: {username}")
|
||||
|
||||
# Get date range from query parameters, default to current day if not provided
|
||||
start_date = request.args.get('start_date', datetime.now().strftime('%Y-%m-%d'))
|
||||
end_date = request.args.get('end_date', start_date)
|
||||
|
||||
try:
|
||||
app.logger.info(f"Fetching activity logs for user: {username}, from: {start_date}, to: {end_date}")
|
||||
results = fetch_user_activity(username, start_date, end_date)
|
||||
activity_logs = format_user_activity(results)
|
||||
app.logger.info(f"Successfully retrieved {len(activity_logs)} activity records for user: {username}")
|
||||
return jsonify({
|
||||
"success": True,
|
||||
"data": {
|
||||
"username": username,
|
||||
"start_date": start_date,
|
||||
"end_date": end_date,
|
||||
"activities": activity_logs
|
||||
}
|
||||
})
|
||||
except Exception as e:
|
||||
app.logger.error(f"Error retrieving user activity logs: {e}")
|
||||
return jsonify({"success": False, "message": "Error retrieving activity logs"}), 500
|
||||
|
||||
# Error handlers
|
||||
@app.errorhandler(400)
|
||||
def bad_request(error):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'message': 'Bad request'
|
||||
}), 400
|
||||
|
||||
@app.errorhandler(404)
|
||||
def not_found_error(error):
|
||||
app.logger.warning(f"404 Not Found error triggered for URL: {request.url}")
|
||||
return jsonify({"success": False, "message": "Resource not found"}), 404
|
||||
|
||||
@app.errorhandler(405)
|
||||
def method_not_allowed(error):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'message': 'Method not allowed'
|
||||
}), 405
|
||||
|
||||
@app.errorhandler(500)
|
||||
def internal_error(error):
|
||||
# Note: The specific error causing the 500 might have already been logged
|
||||
app.logger.error(f"Global 500 Internal Server error handler triggered: {error}")
|
||||
return jsonify({"success": False, "message": "Internal server error"}), 500
|
||||
|
||||
# Simple dashboard (optional)
|
||||
@app.route('/')
|
||||
def dashboard():
|
||||
app.logger.info("Dashboard page requested.")
|
||||
|
||||
# Add direct query to verify data
|
||||
try:
|
||||
# Direct query to list all distinct users
|
||||
direct_query = "SELECT DISTINCT \"user\" FROM work_events"
|
||||
direct_results = db.session.execute(text(direct_query)).fetchall()
|
||||
user_list = [row[0] for row in direct_results]
|
||||
app.logger.info(f"DIRECT QUERY - Distinct users in database: {user_list}")
|
||||
|
||||
# Direct query to count records
|
||||
count_query = "SELECT COUNT(*) FROM work_events"
|
||||
count_result = db.session.execute(text(count_query)).scalar()
|
||||
app.logger.info(f"DIRECT QUERY - Total records in work_events: {count_result}")
|
||||
|
||||
# Get first few records to inspect
|
||||
sample_query = "SELECT id, \"user\", state, ts FROM work_events LIMIT 5"
|
||||
sample_results = db.session.execute(text(sample_query)).fetchall()
|
||||
app.logger.info(f"DIRECT QUERY - Sample records: {sample_results}")
|
||||
|
||||
# Check the current schema name
|
||||
schema_query = "SELECT current_schema()"
|
||||
schema_result = db.session.execute(text(schema_query)).scalar()
|
||||
app.logger.info(f"DIRECT QUERY - Current schema: {schema_result}")
|
||||
|
||||
# List all schemas in the database
|
||||
schemas_query = "SELECT schema_name FROM information_schema.schemata"
|
||||
schemas_results = db.session.execute(text(schemas_query)).fetchall()
|
||||
schema_list = [row[0] for row in schemas_results]
|
||||
app.logger.info(f"DIRECT QUERY - Available schemas: {schema_list}")
|
||||
|
||||
# List all tables in the public schema
|
||||
tables_query = "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
|
||||
tables_results = db.session.execute(text(tables_query)).fetchall()
|
||||
table_list = [row[0] for row in tables_results]
|
||||
app.logger.info(f"DIRECT QUERY - Tables in public schema: {table_list}")
|
||||
|
||||
except Exception as e:
|
||||
app.logger.error(f"Error during direct database query debugging: {str(e)}")
|
||||
|
||||
return render_template('dashboard.html')
|
||||
|
||||
# Create the database tables
|
||||
def init_db():
|
||||
with app.app_context():
|
||||
# Log the full database URI (with sensitive info removed)
|
||||
db_uri = app.config['SQLALCHEMY_DATABASE_URI']
|
||||
if 'postgresql' in db_uri:
|
||||
# Mask password if using PostgreSQL
|
||||
masked_uri = db_uri.replace(db_uri.split('@')[0], 'postgresql://****:****')
|
||||
app.logger.info(f"Using database URI: {masked_uri}")
|
||||
|
||||
# Initialize PostgreSQL-specific components
|
||||
app.logger.info("Detected PostgreSQL database")
|
||||
|
||||
# Check for extensions (can add more as needed)
|
||||
try:
|
||||
db.session.execute(text("SELECT 1 FROM pg_extension WHERE extname = 'pgcrypto'"))
|
||||
app.logger.info("PostgreSQL database version check completed")
|
||||
except Exception as e:
|
||||
app.logger.warning(f"PostgreSQL extension check failed: {e}")
|
||||
else:
|
||||
app.logger.info(f"Using database URI: {db_uri}")
|
||||
|
||||
# For SQLite, ensure parent directory exists and is writable
|
||||
if db_uri and db_uri.startswith('sqlite:///'):
|
||||
db_file = db_uri.replace('sqlite:///', '')
|
||||
db_dir = os.path.dirname(db_file)
|
||||
app.logger.info(f"SQLite database file path: {db_file}")
|
||||
app.logger.info(f"SQLite database directory exists: {os.path.exists(db_dir)}")
|
||||
if os.path.exists(db_dir):
|
||||
app.logger.info(f"SQLite database directory writable: {os.access(db_dir, os.W_OK)}")
|
||||
|
||||
# Create the tables
|
||||
db.create_all()
|
||||
app.logger.info("Database initialized")
|
||||
|
||||
if __name__ == '__main__':
|
||||
print("Starting application...")
|
||||
# Create database tables if they don't exist
|
||||
print(f"Instance path: {app.instance_path}")
|
||||
instance_exists = os.path.exists(app.instance_path)
|
||||
print(f"Instance path exists: {instance_exists}")
|
||||
|
||||
# Make sure the instance directory exists
|
||||
if not instance_exists:
|
||||
try:
|
||||
os.makedirs(app.instance_path)
|
||||
print(f"Created instance directory: {app.instance_path}")
|
||||
except Exception as e:
|
||||
print(f"Error creating instance directory: {e}")
|
||||
|
||||
# Check instance directory permissions
|
||||
has_write_access = os.access(app.instance_path, os.W_OK)
|
||||
print(f"Instance path write access: {has_write_access}")
|
||||
|
||||
# Print database configuration
|
||||
db_uri = app.config['SQLALCHEMY_DATABASE_URI']
|
||||
print(f"Database URI: {db_uri}")
|
||||
|
||||
# For SQLite, print more details about the database file
|
||||
if db_uri.startswith('sqlite:///'):
|
||||
db_file = db_uri.replace('sqlite:///', '')
|
||||
print(f"Database file path: {db_file}")
|
||||
db_dir = os.path.dirname(db_file)
|
||||
print(f"Database directory: {db_dir}")
|
||||
print(f"Database directory exists: {os.path.exists(db_dir)}")
|
||||
print(f"Database directory writable: {os.access(db_dir, os.W_OK)}")
|
||||
print(f"Database file exists: {os.path.exists(db_file)}")
|
||||
if os.path.exists(db_file):
|
||||
print(f"Database file writable: {os.access(db_file, os.W_OK)}")
|
||||
|
||||
# Use try/except to catch and log any initialization errors
|
||||
try:
|
||||
print("Initializing database...")
|
||||
init_db() # Ensure database and tables are created on first run
|
||||
print("Database initialization successful")
|
||||
except Exception as e:
|
||||
print(f"Error during database initialization: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
|
||||
# Check if database file was created (for SQLite)
|
||||
if db_uri.startswith('sqlite:///'):
|
||||
db_file = db_uri.replace('sqlite:///', '')
|
||||
print(f"After init: Database file exists: {os.path.exists(db_file)}")
|
||||
|
||||
# Run the Flask application
|
||||
host = os.environ.get('HOST', '0.0.0.0')
|
||||
port = int(os.environ.get('PORT', 5000))
|
||||
debug = os.environ.get('DEBUG', 'False').lower() == 'true'
|
||||
|
||||
print(f"Starting Flask application on {host}:{port} (debug={debug})")
|
||||
app.run(host=host, port=port, debug=debug)
|
||||
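The credential-masking done before logging the database URI (in `init_db` above and again in `fetch_duration_report`) can be sketched standalone; the example URIs below are hypothetical, not values from this repository:

```python
# Sketch of the credential-masking used before logging a database URI.
# Mirrors the repository's logic: hide everything before the '@' separator.
def mask_db_uri(db_uri: str) -> str:
    """Replace the scheme+credentials portion of a URI with asterisks."""
    if '@' in db_uri:
        # Everything before the first '@' holds scheme and credentials; hide it.
        return "****@" + db_uri.split('@', 1)[1]
    # URIs without credentials (e.g. SQLite file paths) are safe to log as-is.
    return db_uri

print(mask_db_uri("postgresql://admin:secret@db.example.com:5432/tracking"))
print(mask_db_uri("sqlite:///instance/app.db"))
```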
108
app/__init__.py
Normal file
@ -0,0 +1,108 @@
"""
Employee Workstation Activity Tracking - Flask Application Factory

This module provides the application factory function 'create_app' that initializes
the Flask application with its configuration, database connection, and registered blueprints.
"""
import os
import logging
from logging.handlers import RotatingFileHandler
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from dotenv import load_dotenv

# Initialize SQLAlchemy globally to avoid circular imports
db = SQLAlchemy()


def create_app(test_config=None):
    """Create and configure the Flask application using the factory pattern."""

    # Load environment variables
    load_dotenv()  # Try .env first
    config_env_path = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'config.env')
    if os.path.exists(config_env_path):
        load_dotenv(config_env_path)
        print(f"Loaded environment variables from {config_env_path}")
    else:
        print(f"Warning: config.env file not found at {config_env_path}")

    # Get the project root directory (parent of the app directory)
    project_root = os.path.dirname(os.path.dirname(__file__))

    # Create and configure the app with template folder in the project root
    app = Flask(__name__,
                instance_relative_config=True,
                template_folder=os.path.join(project_root, 'templates'),
                static_folder=os.path.join(project_root, 'static'))

    # Default configuration
    app.config.from_mapping(
        SECRET_KEY=os.environ.get('SECRET_KEY', 'dev'),
        SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', os.environ.get('DATABASE_URL')),
        SQLALCHEMY_TRACK_MODIFICATIONS=False
    )

    # Override configuration with the test config if provided
    if test_config is not None:
        app.config.update(test_config)

    # Ensure DATABASE_URI is set
    if not app.config['SQLALCHEMY_DATABASE_URI']:
        raise ValueError("DATABASE_URI or DATABASE_URL environment variable must be set")

    # Ensure the instance folder exists
    try:
        os.makedirs(app.instance_path)
    except OSError:
        pass

    # Configure logging
    if not app.debug:  # Avoid duplicate logs in debug mode
        log_formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
        log_handler = RotatingFileHandler(
            os.path.join(app.instance_path, 'server.log'),
            maxBytes=1024 * 1024 * 5,  # 5 MB
            backupCount=5
        )
        log_handler.setFormatter(log_formatter)
        log_handler.setLevel(logging.INFO)
        app.logger.addHandler(log_handler)
        app.logger.setLevel(logging.INFO)

    app.logger.info('Flask application starting up...')

    # Initialize the database with the app
    db.init_app(app)

    # Import and register blueprints
    from app.api import events_bp, reports_bp
    app.register_blueprint(events_bp)
    app.register_blueprint(reports_bp)

    from app.views.dashboard import views_bp
    app.register_blueprint(views_bp)

    # Register error handlers
    from app.errors import register_error_handlers
    register_error_handlers(app)

    # CLI command to create database tables during development
    @app.cli.command("init-db")
    def init_db_command():
        """Create the database tables."""
        with app.app_context():
            db.create_all()
            app.logger.info("Database tables created")

    @app.route('/healthcheck')
    def healthcheck():
        return {'status': 'ok'}, 200

    return app


def init_db():
    """Initialize the database outside of the CLI context."""
    app = create_app()
    with app.app_context():
        db.create_all()
        app.logger.info("Database initialized")
10
app/api/__init__.py
Normal file
@ -0,0 +1,10 @@
"""
API package for employee workstation activity tracking.

This package contains the API endpoints for reporting events and retrieving data.
"""

from app.api.events import events_bp
from app.api.reports import reports_bp

__all__ = ['events_bp', 'reports_bp']
85
app/api/events.py
Normal file
@ -0,0 +1,85 @@
"""
API endpoints for reporting user activity events.

This module provides endpoints for clients to report state changes (working/stopped).
"""
from datetime import datetime
from flask import Blueprint, request, jsonify, current_app
from sqlalchemy.exc import SQLAlchemyError

from app import db
from app.models import WorkEvent

# Create a blueprint for event-related API endpoints
events_bp = Blueprint('events', __name__, url_prefix='/api')


@events_bp.route('/report', methods=['POST'])
def report_event():
    """
    Endpoint for clients to report activity state changes.

    Expected JSON payload:
    {
        "user": "username",
        "state": "working|stopped",
        "ts": "2023-07-08T12:30:45Z"  (optional, ISO 8601)
    }
    """
    data = request.get_json()
    current_app.logger.info(f"Received report request: {data}")  # Log request

    if not data or 'user' not in data or 'state' not in data:
        current_app.logger.warning("Invalid report request payload.")
        return jsonify({
            'success': False,
            'message': 'Missing required fields: user, state'
        }), 400

    # Validate the state value
    if data['state'] not in ['working', 'stopped']:
        return jsonify({
            'success': False,
            'message': 'Invalid state value. Must be "working" or "stopped"'
        }), 400

    user = data['user']
    state = data['state']
    ts_str = data.get('ts')  # Optional timestamp

    # Parse the timestamp if provided, otherwise use the current UTC time
    event_ts = datetime.utcnow()
    if ts_str:
        try:
            # Attempt to parse ISO 8601 format
            event_ts = datetime.fromisoformat(ts_str.replace('Z', '+00:00'))
        except ValueError:
            current_app.logger.warning(f"Invalid timestamp format received: {ts_str}")
            return jsonify({
                'success': False,
                'message': 'Invalid timestamp format. Use ISO 8601 (YYYY-MM-DDTHH:MM:SSZ)'
            }), 400

    new_event = WorkEvent(user=user, state=state, ts=event_ts)

    try:
        current_app.logger.info(f"Attempting to add event to database: User={user}, State={state}, TS={event_ts}")
        db.session.add(new_event)
        db.session.commit()
        current_app.logger.info(f"Successfully recorded event: User={user}, State={state}")
        return jsonify({"success": True}), 201
    except SQLAlchemyError as e:
        db.session.rollback()
        current_app.logger.error(f"Database error while recording event: {e}")
        return jsonify({"success": False, "message": "Database error"}), 500
    except Exception as e:
        current_app.logger.error(f"Unexpected error processing report request: {e}")
        return jsonify({"success": False, "message": "Internal server error"}), 500
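Since `/api/report` rejects malformed payloads with a 400, a client can pre-validate locally before posting. A minimal sketch mirroring the endpoint's rules (the helper name and sample values are illustrative, not part of the repository):

```python
# Client-side sketch of the payload validation that /api/report performs:
# required 'user' and 'state' fields, state restricted to working/stopped,
# and an optional ISO 8601 'ts' timestamp.
from datetime import datetime


def validate_report_payload(data):
    """Return (ok, message) using the same rules as the /api/report endpoint."""
    if not data or 'user' not in data or 'state' not in data:
        return False, 'Missing required fields: user, state'
    if data['state'] not in ('working', 'stopped'):
        return False, 'Invalid state value. Must be "working" or "stopped"'
    ts = data.get('ts')
    if ts:
        try:
            # Same normalization the server applies to the trailing 'Z'
            datetime.fromisoformat(ts.replace('Z', '+00:00'))
        except ValueError:
            return False, 'Invalid timestamp format. Use ISO 8601 (YYYY-MM-DDTHH:MM:SSZ)'
    return True, 'ok'


print(validate_report_payload({"user": "alice", "state": "working", "ts": "2023-07-08T12:30:45Z"}))
print(validate_report_payload({"user": "alice", "state": "sleeping"}))
```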
384
app/api/reports.py
Normal file
@ -0,0 +1,384 @@
"""
API endpoints for retrieving activity reports.

This module provides endpoints for retrieving daily, weekly, and monthly reports,
as well as detailed user activity logs.
"""
from datetime import datetime, timedelta
from flask import Blueprint, request, jsonify, current_app
from sqlalchemy import text
from sqlalchemy.exc import SQLAlchemyError

from app import db
from app.utils.queries import calculate_duration_sql, filter_sql_by_user
from app.utils.formatting import format_report_data, format_user_activity

# Create a blueprint for report-related API endpoints
reports_bp = Blueprint('reports', __name__, url_prefix='/api')


def fetch_duration_report(time_period, user_filter=None):
    """
    Fetches duration report data from the database.

    Args:
        time_period (str): Time period to group by ('daily', 'weekly', or 'monthly')
        user_filter (str, optional): Username to filter results by

    Returns:
        list: List of report data rows
    """
    current_app.logger.debug(f"Fetching duration report. Period: {time_period}, User: {user_filter}")
    sql_query = calculate_duration_sql(time_period)
    params = {}
    if user_filter:
        # Note: filter_sql_by_user modifies the query string directly
        sql_query = filter_sql_by_user(sql_query, user_filter)
        params['user'] = user_filter
        current_app.logger.debug(f"Applying user filter: {user_filter}")

    # Debug: show which database the query runs against (credentials masked)
    db_uri = current_app.config['SQLALCHEMY_DATABASE_URI']
    masked_uri = db_uri
    if '@' in db_uri:
        parts = db_uri.split('@')
        masked_uri = "****@" + parts[1]
    current_app.logger.info(f"Executing query using database: {masked_uri}")

    try:
        # Simple count query to verify data exists in the database
        count_query = "SELECT COUNT(*) FROM work_events"
        count_result = db.session.execute(text(count_query)).scalar()
        current_app.logger.info(f"Total records in work_events table: {count_result}")

        # Check distinct users
        users_query = "SELECT DISTINCT \"user\" FROM work_events"
        users_result = db.session.execute(text(users_query)).fetchall()
        user_list = [row[0] for row in users_result]
        current_app.logger.info(f"Distinct users in work_events table: {user_list}")

        results = db.session.execute(text(sql_query), params).mappings().all()
        current_app.logger.debug(f"Database query executed. Found {len(results)} rows.")
        return results
    except Exception as e:
        current_app.logger.error(f"Error executing duration report query: {e}")
        # Re-raise the exception to be handled by the endpoint's error handler
        raise
def fetch_user_activity(username, start_date, end_date):
    """
    Fetches detailed user activity logs for a specific date range.

    Args:
        username (str): Username to fetch activity for
        start_date (str): Start date in YYYY-MM-DD format
        end_date (str): End date in YYYY-MM-DD format

    Returns:
        list: List of user activity rows
    """
    current_app.logger.debug(f"Fetching activity logs for user: {username}, from: {start_date}, to: {end_date}")

    # SQL query to match working and stopped pairs and calculate durations
    sql_query = """
    WITH EventPairs AS (
        SELECT
            w1."user",
            DATE(w1.ts) AS work_date,
            w1.ts AS start_time,
            w2.ts AS end_time,
            EXTRACT(EPOCH FROM (w2.ts - w1.ts))/3600 AS session_duration_hours
        FROM
            work_events w1
        JOIN
            work_events w2 ON w1."user" = w2."user"
            AND w1.state = 'working'
            AND w2.state = 'stopped'
            AND w2.ts > w1.ts
            AND NOT EXISTS (
                SELECT 1 FROM work_events w3
                WHERE w3."user" = w1."user"
                AND w3.ts > w1.ts AND w3.ts < w2.ts
            )
        WHERE
            w1."user" = :username
            AND DATE(w1.ts) BETWEEN :start_date AND :end_date
        ORDER BY
            w1.ts
    )
    SELECT * FROM EventPairs
    """

    try:
        params = {
            'username': username,
            'start_date': start_date,
            'end_date': end_date
        }
        results = db.session.execute(text(sql_query), params).mappings().all()
        current_app.logger.debug(f"User activity query executed. Found {len(results)} rows.")
        return results
    except Exception as e:
        current_app.logger.error(f"Error executing user activity query: {e}")
        raise
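The `EventPairs` CTE above pairs each 'working' event with the next event for the same user, counting it as a session only if that next event is 'stopped'. A pure-Python sketch of the same pairing logic, using made-up sample events for illustration:

```python
# Pure-Python sketch of the working/stopped pairing that the EventPairs SQL
# performs: each 'working' event pairs with the immediately following event
# for the same user, and only working -> stopped pairs count as sessions.
from datetime import datetime


def pair_sessions(events):
    """events: list of (ts, state) tuples sorted by ts for a single user.

    Returns (start, end, duration_hours) for each working->stopped pair.
    """
    sessions = []
    for (ts1, state1), (ts2, state2) in zip(events, events[1:]):
        if state1 == 'working' and state2 == 'stopped':
            duration_hours = (ts2 - ts1).total_seconds() / 3600
            sessions.append((ts1, ts2, duration_hours))
    return sessions


events = [
    (datetime(2023, 7, 8, 9, 0), 'working'),
    (datetime(2023, 7, 8, 12, 0), 'stopped'),
    (datetime(2023, 7, 8, 13, 0), 'working'),
    (datetime(2023, 7, 8, 17, 30), 'stopped'),
]
print(pair_sessions(events))  # two sessions: 3.0 h and 4.5 h
```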
@reports_bp.route('/reports/daily', methods=['GET'])
def get_daily_report():
    """
    Endpoint for retrieving daily report data.

    Query Parameters:
        user (str, optional): Filter results by username
        date (str, optional): Specific date in YYYY-MM-DD format
    """
    current_app.logger.info("Daily report API requested.")
    try:
        current_app.logger.info("Fetching daily report data...")

        # Get the date parameter or use today as the default
        selected_date = request.args.get('date')
        if selected_date:
            current_app.logger.info(f"Using selected date: {selected_date}")
        else:
            selected_date = datetime.now().strftime('%Y-%m-%d')
            current_app.logger.info(f"No date provided, using today: {selected_date}")

        user_filter = request.args.get('user')

        # Get regular daily report results
        results = fetch_duration_report('daily', user_filter)

        # Filter to only include entries for the selected date
        filtered_results = []
        for row in results:
            row_date = row['period_start']
            if hasattr(row_date, 'isoformat'):
                row_date = row_date.isoformat()

            # Check if the row's date matches the selected date
            if row_date and row_date.startswith(selected_date):
                filtered_results.append(row)

        # Add debug logging for usernames in raw results
        current_app.logger.info(f"Raw results usernames for date {selected_date}: {[r['user'] for r in filtered_results]}")

        report = format_report_data(filtered_results, 'daily')

        # Add debug logging for usernames in formatted data
        current_app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")

        current_app.logger.info(f"Successfully generated daily report for date: {selected_date}, user: {request.args.get('user', 'All')}. Found {len(filtered_results)} records.")
        return jsonify({"success": True, "data": report})
    except Exception as e:
        current_app.logger.error(f"Error generating daily report: {e}")
        return jsonify({"success": False, "message": "Error generating report"}), 500
@reports_bp.route('/reports/weekly', methods=['GET'])
def get_weekly_report():
    """
    Endpoint for retrieving weekly report data.

    Query Parameters:
        user (str, optional): Filter results by username
        day (str, optional): Specific day in YYYY-MM-DD format
    """
    current_app.logger.info("Weekly report API requested.")
    try:
        current_app.logger.info("Fetching weekly report data...")

        # Check if a specific day within the week was requested
        day_filter = request.args.get('day')
        user_filter = request.args.get('user')

        if day_filter:
            current_app.logger.info(f"Filtering weekly report for specific day: {day_filter}")
            # Use the daily query with the specific date
            results = fetch_duration_report('daily', user_filter)
            # Filter results to only include the requested day
            filtered_results = []
            for row in results:
                row_date = row['period_start']
                if hasattr(row_date, 'isoformat'):
                    row_date = row_date.isoformat()

                # Check if the row's date matches the requested day
                if row_date and row_date.startswith(day_filter):
                    filtered_results.append(row)

            results = filtered_results
        else:
            # Get current week dates for filtering
            now = datetime.now()
            # Get Monday of the current week
            current_week_start = now - timedelta(days=now.weekday())
            current_week_start = current_week_start.replace(hour=0, minute=0, second=0, microsecond=0)
            current_week_end = current_week_start + timedelta(days=7)

            # Regular weekly report (whole week)
            results = fetch_duration_report('weekly', user_filter)

            # Filter to just include the current week and aggregate by user
            filtered_results = []
            user_aggregated = {}

            for row in results:
                row_date = row['period_start']
                # Convert to datetime if it's a string
                if isinstance(row_date, str):
                    try:
                        row_date = datetime.fromisoformat(row_date.replace('Z', '+00:00'))
                    except ValueError:
                        continue

                # Check if it falls within the current week (Monday through Sunday);
                # comparing only for equality with the week start would drop rows
                # whose period_start is any other day of the week
                if row_date and current_week_start.date() <= row_date.date() < current_week_end.date():
                    username = row['user']

                    if username in user_aggregated:
                        # Add duration hours
                        user_aggregated[username]['total_hours'] += row['total_hours']

                        # Keep the earliest first_login_time
                        if row['first_login_time'] and user_aggregated[username]['first_login_time']:
                            row_login = row['first_login_time']
                            if isinstance(row_login, str):
                                try:
                                    row_login = datetime.fromisoformat(row_login.replace('Z', '+00:00'))
                                except ValueError:
                                    row_login = None

                            existing_login = user_aggregated[username]['first_login_time']
                            if isinstance(existing_login, str):
                                try:
                                    existing_login = datetime.fromisoformat(existing_login.replace('Z', '+00:00'))
                                except ValueError:
                                    existing_login = None

                            if row_login and existing_login and row_login < existing_login:
                                user_aggregated[username]['first_login_time'] = row['first_login_time']
                    else:
                        # First entry for this user
                        user_aggregated[username] = {
                            'user': username,
                            'period_start': row_date,
                            'total_hours': row['total_hours'],
                            'first_login_time': row['first_login_time']
                        }

            # Convert aggregated dict back to list
            filtered_results = list(user_aggregated.values())
            results = filtered_results if filtered_results else results

        # Add debug logging for usernames in raw results
        current_app.logger.info(f"Raw results usernames: {[r['user'] for r in results]}")

        report = format_report_data(results, 'weekly')

        # Add debug logging for usernames in formatted data
        current_app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")

        current_app.logger.info(f"Successfully generated weekly report for user: {request.args.get('user', 'All')}. Found {len(results)} records.")
        return jsonify({"success": True, "data": report})
    except Exception as e:
        current_app.logger.error(f"Error generating weekly report: {e}")
        return jsonify({"success": False, "message": "Error generating report"}), 500
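The current-week window used by the weekly report (Monday 00:00 up to, but not including, the following Monday) can be sketched standalone; the helper name and sample date are illustrative:

```python
# Sketch of the weekly report's time window: Monday 00:00 of the week
# containing `now`, through (exclusive) the next Monday.
from datetime import datetime, timedelta


def week_window(now):
    """Return (start, end) datetimes bounding the week that contains `now`."""
    # weekday() is 0 for Monday, so subtracting it lands on Monday
    week_start = (now - timedelta(days=now.weekday())).replace(
        hour=0, minute=0, second=0, microsecond=0)
    return week_start, week_start + timedelta(days=7)


start, end = week_window(datetime(2023, 7, 8, 15, 42))  # a Saturday afternoon
print(start, end)
```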
@reports_bp.route('/reports/monthly', methods=['GET'])
def get_monthly_report():
    """
    Endpoint for retrieving monthly report data.

    Query Parameters:
        user (str, optional): Filter results by username
    """
    current_app.logger.info("Monthly report API requested.")
    try:
        current_app.logger.info("Fetching monthly report data...")

        # Get current month for filtering
        now = datetime.now()
        current_month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)

        # Get regular monthly report
        results = fetch_duration_report('monthly', request.args.get('user'))

        # Filter to only include the current month and aggregate by user
        filtered_results = []
        user_aggregated = {}

        for row in results:
            row_date = row['period_start']
            # Convert to datetime if it's a string
            if isinstance(row_date, str):
                try:
                    row_date = datetime.fromisoformat(row_date.replace('Z', '+00:00'))
                except ValueError:
                    continue

            # Check if it's from the current month
            if row_date and row_date.year == current_month_start.year and row_date.month == current_month_start.month:
                username = row['user']

                if username in user_aggregated:
                    # Add duration hours
                    user_aggregated[username]['total_hours'] += row['total_hours']
                else:
                    # First entry for this user
                    user_aggregated[username] = {
                        'user': username,
                        'period_start': row_date,
                        'total_hours': row['total_hours'],
                        'first_login_time': row['first_login_time']
                    }

        # Convert aggregated dict back to list
        filtered_results = list(user_aggregated.values())
        results = filtered_results if filtered_results else results

        # Add debug logging for usernames in raw results
        current_app.logger.info(f"Raw results usernames: {[r['user'] for r in results]}")

        report = format_report_data(results, 'monthly')

        # Add debug logging for usernames in formatted data
        current_app.logger.info(f"Formatted data usernames: {[r['user'] for r in report]}")

        current_app.logger.info(f"Successfully generated monthly report for user: {request.args.get('user', 'All')}. Found {len(results)} records.")
        return jsonify({"success": True, "data": report})
    except Exception as e:
        current_app.logger.error(f"Error generating monthly report: {e}")
        return jsonify({"success": False, "message": "Error generating report"}), 500
@reports_bp.route('/user-activity/<username>', methods=['GET'])
|
||||
def get_user_activity(username):
|
||||
"""
|
||||
Gets detailed activity logs for a specific user.
|
||||
|
||||
Path Parameter:
|
||||
username: Username to fetch activity for
|
||||
|
||||
Query Parameters:
|
||||
start_date (str, optional): Start date in YYYY-MM-DD format
|
||||
end_date (str, optional): End date in YYYY-MM-DD format
|
||||
"""
|
||||
current_app.logger.info(f"User activity logs requested for: {username}")
|
||||
|
||||
# Get date range from query parameters, default to current day if not provided
|
||||
start_date = request.args.get('start_date', datetime.now().strftime('%Y-%m-%d'))
|
||||
end_date = request.args.get('end_date', start_date)
|
||||
|
||||
try:
|
||||
current_app.logger.info(f"Fetching activity logs for user: {username}, from: {start_date}, to: {end_date}")
|
||||
results = fetch_user_activity(username, start_date, end_date)
|
||||
activity_logs = format_user_activity(results)
|
||||
current_app.logger.info(f"Successfully retrieved {len(activity_logs)} activity records for user: {username}")
|
||||
return jsonify({
|
||||
"success": True,
|
||||
"data": {
|
||||
"username": username,
|
||||
"start_date": start_date,
|
||||
"end_date": end_date,
|
||||
"activities": activity_logs
|
||||
}
|
||||
})
|
||||
except Exception as e:
|
||||
current_app.logger.error(f"Error retrieving user activity logs: {e}")
|
||||
return jsonify({"success": False, "message": "Error retrieving activity logs"}), 500
|
||||
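For reference, a minimal client call to the user-activity endpoint might look like the sketch below. The host and port are placeholders, and the route is shown without a blueprint URL prefix; both depend on how the server is deployed.

```python
import json
from urllib.request import urlopen

# 'tracking-server:5000' is a hypothetical host; adjust for your deployment
base = "http://tracking-server:5000"
url = f"{base}/user-activity/Robert?start_date=2025-05-01&end_date=2025-05-04"
print(url)

# With a live server, the response can be decoded like this:
# payload = json.load(urlopen(url))
# payload["data"]["activities"] -> list of {date, start_time, end_time, duration_hours}
```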
47 app/errors.py Normal file
@@ -0,0 +1,47 @@
"""
Global error handlers for the application.

This module provides centralized error handling functions for common HTTP errors.
"""
from flask import jsonify, request


def register_error_handlers(app):
    """
    Register global error handlers with the Flask application.

    Args:
        app: Flask application instance
    """

    @app.errorhandler(400)
    def bad_request(error):
        """Handle 400 Bad Request errors."""
        app.logger.warning(f"400 Bad Request error for URL: {request.url}")
        return jsonify({
            'success': False,
            'message': 'Bad request'
        }), 400

    @app.errorhandler(404)
    def not_found_error(error):
        """Handle 404 Not Found errors."""
        app.logger.warning(f"404 Not Found error triggered for URL: {request.url}")
        return jsonify({"success": False, "message": "Resource not found"}), 404

    @app.errorhandler(405)
    def method_not_allowed(error):
        """Handle 405 Method Not Allowed errors."""
        app.logger.warning(f"405 Method Not Allowed error for URL: {request.url}")
        return jsonify({
            'success': False,
            'message': 'Method not allowed'
        }), 405

    @app.errorhandler(500)
    def internal_error(error):
        """Handle 500 Internal Server errors."""
        # Note: the specific error causing the 500 may have already been logged
        app.logger.error(f"Global 500 Internal Server error handler triggered: {error}")
        return jsonify({"success": False, "message": "Internal server error"}), 500

    app.logger.info("Registered global error handlers")
38 app/models.py Normal file
@@ -0,0 +1,38 @@
"""
Database models for employee workstation activity tracking.

This module defines the SQLAlchemy models used for storing activity events.
"""
from app import db


class WorkEvent(db.Model):
    """
    Represents a user activity event with state transitions (working/stopped).

    Attributes:
        id (int): Primary key for the event
        user (str): Username of the person whose activity is being tracked
        state (str): Current activity state ('working' or 'stopped')
        ts (datetime): Timestamp when the event occurred
    """
    __tablename__ = 'work_events'

    id = db.Column(db.Integer, primary_key=True)
    user = db.Column(db.String(100), nullable=False, index=True)
    state = db.Column(db.String(10), nullable=False)  # 'working' or 'stopped'
    ts = db.Column(db.DateTime, nullable=False,
                   server_default=db.func.current_timestamp(),
                   index=True)

    def __repr__(self):
        """Return a string representation of the model."""
        return f"<WorkEvent(user='{self.user}', state='{self.state}', ts='{self.ts}')>"

    def to_dict(self):
        """Convert the model to a dictionary for API responses."""
        return {
            'id': self.id,
            'user': self.user,
            'state': self.state,
            'ts': self.ts.isoformat() if self.ts else None
        }
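To illustrate the shape that `to_dict()` produces, here is a plain-Python stand-in (the real model needs a configured SQLAlchemy `db` and an app context, so `event_to_dict` below is a hypothetical helper, not part of the codebase):

```python
from datetime import datetime

def event_to_dict(id, user, state, ts):
    # Mirrors WorkEvent.to_dict(): ts is serialized as an ISO 8601 string
    return {
        'id': id,
        'user': user,
        'state': state,
        'ts': ts.isoformat() if ts else None,
    }

d = event_to_dict(1, 'Robert', 'working', datetime(2025, 5, 4, 8, 30))
print(d['ts'])  # 2025-05-04T08:30:00
```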
15 app/utils/__init__.py Normal file
@@ -0,0 +1,15 @@
"""
Utility functions package for employee workstation activity tracking.

This package contains helper functions for building queries and formatting data.
"""

from app.utils.queries import calculate_duration_sql, filter_sql_by_user
from app.utils.formatting import format_report_data, format_user_activity

__all__ = [
    'calculate_duration_sql',
    'filter_sql_by_user',
    'format_report_data',
    'format_user_activity'
]
113 app/utils/formatting.py Normal file
@@ -0,0 +1,113 @@
"""
Data formatting functions for employee workstation activity tracking.

This module contains functions for formatting database query results into API responses.
"""
from flask import current_app


def format_report_data(results, time_period):
    """
    Formats the raw database results into a list of dictionaries for the API.

    Args:
        results (list): List of database result rows (SQLAlchemy RowMapping objects)
        time_period (str): Time period of the report ('daily', 'weekly', or 'monthly')

    Returns:
        list: List of formatted dictionaries for the API response
    """
    current_app.logger.debug(f"Formatting report data for period: {time_period}. Input rows: {len(results)}")
    period_key_map = {
        'daily': 'day',
        'weekly': 'week_start',
        'monthly': 'month_start'
    }
    period_key = period_key_map.get(time_period, 'period_start')  # Default if unexpected period

    # First convert raw rows to dictionaries
    raw_data = []
    for row in results:
        # Convert period_start to a string if it's a date/datetime object
        period_value = row['period_start']
        if hasattr(period_value, 'isoformat'):
            period_value = period_value.isoformat()

        # Format first_login_time
        first_login_time = row['first_login_time']
        if hasattr(first_login_time, 'isoformat'):
            first_login_time = first_login_time.isoformat()

        # Ensure duration_hours is a float, not a string or Decimal
        duration_hours = row['total_hours']
        if duration_hours is None:
            duration_hours = 0.0
        else:
            # Convert to float explicitly so it serializes to JSON as a number
            duration_hours = float(duration_hours)

        raw_data.append({
            'user': row['user'],
            period_key: period_value,
            'duration_hours': duration_hours,
            'first_login_time': first_login_time
        })

    # Consolidate any duplicate entries for the same user and period
    user_period_map = {}
    for entry in raw_data:
        user = entry['user']
        period = entry[period_key]
        key = f"{user}_{period}"

        if key in user_period_map:
            # Aggregate duration for the existing user+period
            user_period_map[key]['duration_hours'] += entry['duration_hours']

            # Use the earliest first_login_time
            existing_time = user_period_map[key]['first_login_time']
            new_time = entry['first_login_time']

            if existing_time and new_time:
                if new_time < existing_time:
                    user_period_map[key]['first_login_time'] = new_time
            elif new_time:
                # The existing entry had no login time; take the new one
                user_period_map[key]['first_login_time'] = new_time
        else:
            # New user+period combination
            user_period_map[key] = entry

    # Convert the consolidated map back to a list
    formatted_data = list(user_period_map.values())

    current_app.logger.debug(f"Formatted report data created. Output rows: {len(formatted_data)}")
    return formatted_data


def format_user_activity(results):
    """
    Formats the raw user activity results into a list of dictionaries.

    Args:
        results (list): List of database result rows (SQLAlchemy RowMapping objects)

    Returns:
        list: List of formatted user activity dictionaries
    """
    formatted_data = []
    for row in results:
        start_time = row['start_time']
        end_time = row['end_time']

        # Format timestamps for display
        if hasattr(start_time, 'isoformat'):
            start_time = start_time.isoformat()
        if hasattr(end_time, 'isoformat'):
            end_time = end_time.isoformat()

        # Format duration as a float
        duration = float(row['session_duration_hours']) if row['session_duration_hours'] is not None else 0.0

        formatted_data.append({
            'date': row['work_date'].isoformat() if hasattr(row['work_date'], 'isoformat') else str(row['work_date']),
            'start_time': start_time,
            'end_time': end_time,
            'duration_hours': round(duration, 2)
        })
    return formatted_data
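The user+period consolidation in `format_report_data` can be sketched with plain dicts (no Flask required). The sample rows here are illustrative, not real query output:

```python
# Two rows for the same user and day should merge into one entry:
# durations add up, and the earliest first_login_time wins.
rows = [
    {'user': 'Ilia', 'day': '2025-05-04', 'duration_hours': 4.0, 'first_login_time': '09:00'},
    {'user': 'Ilia', 'day': '2025-05-04', 'duration_hours': 4.0, 'first_login_time': '14:00'},
]
merged = {}
for e in rows:
    key = f"{e['user']}_{e['day']}"
    if key in merged:
        merged[key]['duration_hours'] += e['duration_hours']
        if e['first_login_time'] < merged[key]['first_login_time']:
            merged[key]['first_login_time'] = e['first_login_time']
    else:
        merged[key] = dict(e)

print(list(merged.values()))
# [{'user': 'Ilia', 'day': '2025-05-04', 'duration_hours': 8.0, 'first_login_time': '09:00'}]
```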
95 app/utils/queries.py Normal file
@@ -0,0 +1,95 @@
"""
SQL query building functions for employee workstation activity tracking.

This module contains functions for building SQL queries to calculate working durations.
"""
from flask import current_app


def calculate_duration_sql(time_period):
    """
    Generates the core SQL query to calculate working durations.
    Uses the LEAD() window function to pair each 'working' event with the next event,
    and calculates the duration in hours using PostgreSQL functions.

    Args:
        time_period (str): Time period to group by ('daily', 'weekly', or 'monthly')

    Returns:
        str: SQL query for calculating durations
    """
    # PostgreSQL date functions (already compatible with the database)
    period_grouping = {
        'daily': "DATE_TRUNC('day', start_time)",
        'weekly': "DATE_TRUNC('week', start_time)",  # PostgreSQL weeks start on Monday
        'monthly': "DATE_TRUNC('month', start_time)"
    }.get(time_period, "DATE_TRUNC('day', start_time)")  # Default to daily if invalid

    # Calculate duration using EXTRACT(EPOCH FROM interval) / 3600 for hours
    duration_calculation = "EXTRACT(EPOCH FROM (next_event_time - start_time)) / 3600.0"

    # Use the public schema explicitly and aggregate by user and period
    sql_query = f"""
    WITH EventPairs AS (
        SELECT
            "user",
            ts AS start_time,
            state,
            LEAD(ts) OVER (PARTITION BY "user" ORDER BY ts) AS next_event_time,
            LEAD(state) OVER (PARTITION BY "user" ORDER BY ts) AS next_event_state
        FROM public.work_events
        ORDER BY "user", ts
    ),
    CalculatedDurations AS (
        SELECT
            "user",
            {period_grouping} AS period_start,
            SUM(
                CASE
                    WHEN state = 'working' AND next_event_time IS NOT NULL THEN
                        {duration_calculation}
                    ELSE 0  -- Ignore intervals starting with 'stopped' or without a following event
                END
            ) AS total_hours,
            MIN(CASE WHEN state = 'working' THEN start_time END) AS first_login_time
        FROM EventPairs
        WHERE state = 'working'  -- Only consider intervals that start with 'working'
        GROUP BY "user", period_start
    )
    -- Final aggregation to ensure one row per user per period
    SELECT
        "user",
        period_start,
        SUM(total_hours) AS total_hours,
        MIN(first_login_time) AS first_login_time
    FROM CalculatedDurations
    GROUP BY "user", period_start
    ORDER BY "user", period_start DESC;
    """

    # Debug logging to show the generated SQL query
    current_app.logger.info(f"Generated SQL query: {sql_query}")

    return sql_query


def filter_sql_by_user(base_sql, user):
    """
    Applies a user filter to the SQL query safely, binding :user as a parameter.

    Args:
        base_sql (str): The base SQL query to modify
        user (str): Username to filter by

    Returns:
        str: SQL query with the user filter applied
    """
    # Find the position of the first GROUP BY so the WHERE clause is replaced correctly
    group_by_pos = base_sql.find("GROUP BY")
    if group_by_pos != -1:
        # Make the user filter case-insensitive using LOWER()
        where_clause = "WHERE state = 'working' AND LOWER(\"user\") = LOWER(:user)\n        "
        # Replace the generic WHERE clause with the user-specific one
        filtered_sql = base_sql[:base_sql.find("WHERE")] + where_clause + base_sql[group_by_pos:]
        return filtered_sql
    else:
        # Should not happen with the current query structure, but handle it defensively
        return base_sql  # Return the original query if GROUP BY is not found
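What LEAD() does in `calculate_duration_sql` can be shown in miniature with Python: each event is paired with the next event for the same user, and only pairs that start with 'working' contribute to the total. Timestamps are simplified to hours-of-day; the figures mirror the Nika sample data (8:15–12:00 and 12:45–16:30):

```python
# (user, state, timestamp as hour-of-day)
events = [
    ('Nika', 'working', 8.25),
    ('Nika', 'stopped', 12.0),
    ('Nika', 'working', 12.75),
    ('Nika', 'stopped', 16.5),
]

total = 0.0
for cur, nxt in zip(events, events[1:]):  # pair each event with its successor, like LEAD()
    if cur[1] == 'working':               # only intervals that start with 'working' count
        total += nxt[2] - cur[2]

print(total)  # 7.5
```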
9 app/views/__init__.py Normal file
@@ -0,0 +1,9 @@
"""
Web views package for employee workstation activity tracking.

This package contains the web routes for the dashboard interface.
"""

from app.views.dashboard import views_bp

__all__ = ['views_bp']
30 app/views/dashboard.py Normal file
@@ -0,0 +1,30 @@
"""
Web routes for the dashboard interface.

This module provides the HTML routes for serving the dashboard interface.
"""
from flask import Blueprint, render_template, current_app

# Create a blueprint for dashboard views
views_bp = Blueprint('views', __name__, url_prefix='/')


@views_bp.route('/')
def dashboard():
    """
    Renders the main dashboard interface.

    Returns:
        HTML template: The dashboard template
    """
    current_app.logger.info("Dashboard page requested.")
    return render_template('dashboard.html')


@views_bp.route('/healthcheck')
def healthcheck():
    """
    Simple health check endpoint for monitoring.

    Returns:
        JSON: Status indicator
    """
    return {'status': 'ok'}, 200
82 check_db.py Normal file
@@ -0,0 +1,82 @@
"""
PostgreSQL Connection Test Script

Verifies the connection to the PostgreSQL database and lists existing tables.
"""
import os
import sys
import traceback
from dotenv import load_dotenv
from sqlalchemy import create_engine, text

# Print diagnostic information
print("Python version:", sys.version)
print("Current working directory:", os.getcwd())

# Load environment variables from both potential sources
print("Loading environment variables...")
load_dotenv()              # From .env
load_dotenv("config.env")  # From config.env

# Print all loaded database variables (masking sensitive credentials)
for key, value in os.environ.items():
    if key.startswith("DATABASE"):
        masked_value = value
        if "@" in value:
            parts = value.split("@")
            masked_value = "****@" + parts[1]
        print(f"{key}: {masked_value}")

# Get the database URI
db_uri = os.environ.get('DATABASE_URI', os.environ.get('DATABASE_URL'))

if not db_uri:
    print("ERROR: DATABASE_URI or DATABASE_URL environment variable not set!")
    print("Available environment variables:", [key for key in os.environ.keys() if not key.startswith("_")])
    sys.exit(1)

print(f"Attempting to connect to database: {db_uri.split('@')[1] if '@' in db_uri else 'unknown'}")

try:
    # Create the engine with echo enabled for verbose output
    print("Creating SQLAlchemy engine...")
    engine = create_engine(db_uri, echo=True)

    # Test the connection
    print("Attempting to connect...")
    with engine.connect() as conn:
        print("✓ Successfully connected to PostgreSQL database!")

        # Get the list of tables
        print("Querying for tables...")
        result = conn.execute(text("SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"))
        tables = [row[0] for row in result]

        if tables:
            print("\nExisting tables:")
            for table in tables:
                print(f"  - {table}")
        else:
            print("\nNo tables found in the database.")

        # Check whether the work_events table exists
        if 'work_events' in tables:
            # Count records
            result = conn.execute(text("SELECT COUNT(*) FROM work_events"))
            count = result.scalar()
            print(f"\nThe work_events table contains {count} records.")

            # Sample data
            if count > 0:
                result = conn.execute(text("SELECT * FROM work_events LIMIT 5"))
                rows = result.fetchall()
                print("\nSample data:")
                for row in rows:
                    print(f"  {row}")
        else:
            print("\nThe work_events table does not exist yet.")

except Exception as e:
    print(f"ERROR: Failed to connect to the database: {e}")
    print("\nDetailed error information:")
    traceback.print_exc()
    sys.exit(1)
314 create_db.sql Normal file
@@ -0,0 +1,314 @@
-- Employee Workstation Activity Tracking System
-- Database Schema Creation Script

-- Instructions:
-- 1. For PostgreSQL: psql -U username -d database_name -f create_db.sql
-- 2. For SQLite: sqlite3 work_events.db < create_db.sql
--    (replace SERIAL PRIMARY KEY with INTEGER PRIMARY KEY AUTOINCREMENT first;
--     the reporting queries below use PostgreSQL-specific functions)

-- Drop the existing table if it exists
DROP TABLE IF EXISTS work_events;

-- Create the work_events table
CREATE TABLE work_events (
    id SERIAL PRIMARY KEY,
    "user" VARCHAR(100) NOT NULL,  -- "user" is a reserved word, hence the double quotes
    state VARCHAR(10) NOT NULL CHECK (state IN ('working', 'stopped')),
    ts TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Create indexes for performance optimization
CREATE INDEX idx_work_events_user ON work_events("user");
CREATE INDEX idx_work_events_ts ON work_events(ts);
CREATE INDEX idx_work_events_user_state ON work_events("user", state);

-- Optional: Sample data for testing (comment out in production)

-- Clear previous sample data if necessary before inserting new data
-- DELETE FROM work_events WHERE "user" LIKE 'User %' OR "user" LIKE 'user%';

-- Older generic sample data (kept for reference, disabled)
/*
INSERT INTO work_events ("user", state, ts) VALUES
('user1', 'working', CURRENT_TIMESTAMP - INTERVAL '2 hours'),
('user1', 'stopped', CURRENT_TIMESTAMP - INTERVAL '1 hour'),
('user1', 'working', CURRENT_TIMESTAMP - INTERVAL '30 minutes'),
('user2', 'working', CURRENT_TIMESTAMP - INTERVAL '3 hours'),
('user2', 'stopped', CURRENT_TIMESTAMP - INTERVAL '2 hours 30 minutes'),
('user2', 'working', CURRENT_TIMESTAMP - INTERVAL '2 hours'),
('user2', 'stopped', CURRENT_TIMESTAMP - INTERVAL '1 hour');
*/

-- Example queries for reporting

-- 1. Daily activity summary per user
SELECT
    "user",
    DATE(ts) AS day,
    COUNT(*) AS event_count,
    SUM(CASE WHEN state = 'working' THEN 1 ELSE 0 END) AS working_events
FROM work_events
GROUP BY "user", day
ORDER BY day DESC, "user";

-- 2. Weekly activity summary
SELECT
    "user",
    date_trunc('week', ts) AS week_start,
    COUNT(*) AS event_count
FROM work_events
GROUP BY "user", week_start
ORDER BY week_start DESC, "user";

-- 3. Monthly activity summary
SELECT
    "user",
    date_trunc('month', ts) AS month_start,
    COUNT(*) AS event_count
FROM work_events
GROUP BY "user", month_start
ORDER BY month_start DESC, "user";

-- Notes on data retention:
-- According to requirements, raw event logs should be retained for 1 year.
-- Set up a periodic cleanup job to remove old data:

-- PostgreSQL:
DELETE FROM work_events
WHERE ts < CURRENT_TIMESTAMP - INTERVAL '1 year';

-- SQLite:
/*
DELETE FROM work_events
WHERE ts < datetime('now', '-1 year');
*/

-- Enhanced sample data generation (relative to the assumed date 2025-05-04)
-- Gives each sample user data for Today, This Week, and This Month (April)

-- Sample data for real users: Robert, Ilia, and Nika

-- Robert: working pattern for today, this week, and this month
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'working', '2025-05-04 08:30:00');  -- Today
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'stopped', '2025-05-04 12:30:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'working', '2025-05-04 13:15:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'stopped', '2025-05-04 17:00:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'working', '2025-05-02 08:45:00');  -- This Week
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'stopped', '2025-05-02 16:15:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'working', '2025-04-15 09:00:00');  -- This Month
INSERT INTO work_events ("user", state, ts) VALUES ('Robert', 'stopped', '2025-04-15 17:30:00');

-- Ilia: working pattern for today, this week, and this month
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'working', '2025-05-04 09:00:00');  -- Today
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'stopped', '2025-05-04 13:00:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'working', '2025-05-04 14:00:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'stopped', '2025-05-04 18:00:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'working', '2025-05-03 09:30:00');  -- This Week
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'stopped', '2025-05-03 17:30:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'working', '2025-04-20 08:30:00');  -- This Month
INSERT INTO work_events ("user", state, ts) VALUES ('Ilia', 'stopped', '2025-04-20 16:30:00');

-- Nika: working pattern for today, this week, and this month
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'working', '2025-05-04 08:15:00');  -- Today
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'stopped', '2025-05-04 12:00:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'working', '2025-05-04 12:45:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'stopped', '2025-05-04 16:30:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'working', '2025-05-01 08:30:00');  -- This Week
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'stopped', '2025-05-01 16:45:00');
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'working', '2025-04-10 09:15:00');  -- This Month
INSERT INTO work_events ("user", state, ts) VALUES ('Nika', 'stopped', '2025-04-10 17:15:00');

-- Enhanced SQL queries for reporting with duration calculations

-- 4. Daily working duration per user (for the dashboard)
-- Calculates total working hours per day by matching working->stopped pairs
SELECT
    w1."user",
    DATE(w1.ts) AS day,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
GROUP BY w1."user", day
ORDER BY day DESC, w1."user";

-- 5. Weekly working duration per user (for the dashboard)
SELECT
    w1."user",
    date_trunc('week', w1.ts) AS week_start,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
GROUP BY w1."user", week_start
ORDER BY week_start DESC, w1."user";

-- 6. Monthly working duration per user (for the dashboard)
SELECT
    w1."user",
    date_trunc('month', w1.ts) AS month_start,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
GROUP BY w1."user", month_start
ORDER BY month_start DESC, w1."user";

-- 7. Detailed user activity log for a specific user and date range
-- Shows all work sessions with start time, end time, and duration
-- Use :username, :start_date, and :end_date as parameters
SELECT
    w1."user",
    DATE(w1.ts) AS work_date,
    w1.ts AS start_time,
    w2.ts AS end_time,
    EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600 AS session_duration_hours
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
WHERE w1."user" = :username
    AND DATE(w1.ts) BETWEEN :start_date AND :end_date
ORDER BY w1.ts;

-- 8. Filters for specific time periods (Today, This Week, This Month)

-- Today's work duration by user
SELECT
    w1."user",
    CURRENT_DATE AS day,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
WHERE DATE(w1.ts) = CURRENT_DATE
GROUP BY w1."user"
ORDER BY w1."user";

-- This week's work duration by user
SELECT
    w1."user",
    date_trunc('week', CURRENT_DATE) AS week_start,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
WHERE date_trunc('week', w1.ts) = date_trunc('week', CURRENT_DATE)
GROUP BY w1."user"
ORDER BY w1."user";

-- This month's work duration by user
SELECT
    w1."user",
    date_trunc('month', CURRENT_DATE) AS month_start,
    SUM(EXTRACT(EPOCH FROM (w2.ts - w1.ts)) / 3600) AS duration_hours,
    MIN(CASE WHEN w1.state = 'working' THEN w1.ts END) AS first_login_time
FROM work_events w1
JOIN work_events w2
    ON w1."user" = w2."user"
    AND w1.state = 'working'
    AND w2.state = 'stopped'
    AND w2.ts > w1.ts
    AND NOT EXISTS (
        SELECT 1 FROM work_events w3
        WHERE w3."user" = w1."user"
        AND w3.ts > w1.ts AND w3.ts < w2.ts
    )
WHERE date_trunc('month', w1.ts) = date_trunc('month', CURRENT_DATE)
GROUP BY w1."user"
ORDER BY w1."user";
22 fix_task.cmd Normal file
@@ -0,0 +1,22 @@
@echo off
echo Fixing the UserActivityTracking task...

REM Delete the existing task if it exists
schtasks /Delete /TN "UserActivityTracking" /F

REM Create a new task with better parameters
schtasks /Create /TN "UserActivityTracking" /TR "wscript.exe \"C:\Users\test.user2\Tracking\run_hidden.vbs\"" /SC ONLOGON /RU "INTERACTIVE" /F /RL HIGHEST

REM Check whether the task was created successfully
schtasks /Query /TN "UserActivityTracking" >nul 2>&1
if %ERRORLEVEL% EQU 0 (
    echo Task created successfully.
) else (
    echo Failed to create task.
)

echo.
echo After logging out and back in, check the log at:
echo %TEMP%\user_tracking_launcher.log
echo.
pause
85 project_config.md Normal file
@@ -0,0 +1,85 @@
# Project Configuration (LTM)

This file contains the stable, long-term context for the project. It should be updated infrequently, primarily when core goals, tech, or patterns change.

## Core Goal

Implement an automated employee workstation activity tracking system that logs user login events and active/inactive periods on Windows machines, reports these events to a central Flask API, and provides a dashboard with daily, weekly, and monthly working time summaries.

## Tech Stack

* **Client**:
  * OS: Windows 10/11
  * Scripting: PowerShell
  * Scheduler: Windows Task Scheduler
* **Server**:
  * Language: Python 3.9+
  * Framework: Flask
  * ORM: SQLAlchemy
  * Database: SQLite (development) / PostgreSQL (production)
  * Server: Gunicorn / `flask run`
* **Frontend**:
  * HTML, CSS, JavaScript
  * Charting Library: Chart.js
* **Infrastructure**:
  * Host: On-premises LAN server
  * Networking: HTTP on corporate LAN (no external exposure)
* **DevOps & Tooling**:
  * Version Control: Git
  * Testing: pytest
  * Database Migrations: Alembic (for PostgreSQL)
  * Linting/Formatting: Flake8, Black, isort

## Critical Patterns & Conventions

* **API Design**:
  * RESTful endpoint: `POST /api/report`
  * Request payload: JSON with fields `user`, `state`, `ts` (ISO 8601)
  * Response format: JSON `{ "success": bool, "message"?: "error details" }`, HTTP 201 on success
* **Database Model**:
  * Table `work_events` with columns:
    * `id` (PK, auto-increment)
    * `user` (VARCHAR)
    * `state` (VARCHAR, e.g., "working"/"stopped")
    * `ts` (DATETIME, default now)
* **Idle Detection Logic**:
  * Fixed inactivity threshold of 5 minutes
  * State transitions: working → stopped on crossing the threshold, stopped → working on activity
* **Error Handling**:
  * **PowerShell**: Wrap API calls in `try/catch`, log failures, optional retry
  * **Flask**: Global error handlers returning structured JSON errors
* **Configuration Management**:
  * Environment variables or a `.env` file for the API server URL, port, and DB connection string
* **Coding Standards**:
  * **Python**: PEP 8, docstrings for modules and functions
||||
* **PowerShell**: Verb-Noun function names, consistent two-space indentation
|
||||
* **JavaScript**: ES6+ syntax, modular code, avoid globals
|
||||
* **Commit Messages**:
|
||||
|
||||
* Use Conventional Commits format (e.g., `feat: add idle detection logic`)
|
||||
|
||||
## Key Constraints
|
||||
|
||||
* **Client Runtime**: Must rely solely on built-in PowerShell and Task Scheduler (no external installs)
|
||||
* **Idle Threshold**: Exactly 5 minutes of inactivity
|
||||
* **Authentication**: No API auth; system is LAN-restricted
|
||||
* **Deployment**: Server must run on-premises within corporate LAN
|
||||
* **Data Retention**: Raw event logs retained for 1 year; aggregated summaries retained indefinitely
|
||||
* **Performance**: Support up to 100 client reports per minute without degradation
|
||||
|
||||
## Tokenization Settings
|
||||
|
||||
Estimation Method: Character-based
|
||||
Characters Per Token (Estimate): 4
|
||||
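The `POST /api/report` contract documented in project_config.md can be sketched with a small Python pair of helpers: one builds a payload in the documented shape, the other mirrors the documented server-side response. This is a sketch of the *convention*, not the actual Flask handler; the field names and status codes come from the conventions listed above:

```python
import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"user", "state", "ts"}
VALID_STATES = {"working", "stopped"}

def make_report(user, state):
    """Build a JSON report body matching the documented schema."""
    if state not in VALID_STATES:
        raise ValueError(f"invalid state: {state}")
    return json.dumps({
        "user": user,
        "state": state,
        "ts": datetime.now(timezone.utc).isoformat(),  # ISO 8601 timestamp
    })

def validate_report(body):
    """Server-side check mirroring the documented contract:
    {"success": bool, "message"?: ...} with HTTP 201 on success."""
    data = json.loads(body)
    missing = REQUIRED_FIELDS - data.keys()
    if missing or data["state"] not in VALID_STATES:
        return {"success": False, "message": "invalid payload"}, 400
    return {"success": True}, 201

resp, status = validate_report(make_report("alice", "working"))
print(status)  # 201
```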
308
report.ps1
Normal file
@ -0,0 +1,308 @@
#Requires -Version 5.1
<#
.SYNOPSIS
  Tracks user activity and reports working/stopped status to a central server.
.DESCRIPTION
  This PowerShell script monitors user idle time and reports state changes
  to a central Flask API. It runs at user logon via Task Scheduler and
  maintains a state machine between "working" and "stopped" states based
  on a 5-minute inactivity threshold.
.NOTES
  File Name : report.ps1
  Author    : IT Department
  Version   : 1.2
#>

# Hide the PowerShell console window
Add-Type -Name Window -Namespace Console -MemberDefinition '
[DllImport("Kernel32.dll")]
public static extern IntPtr GetConsoleWindow();
[DllImport("user32.dll")]
public static extern bool ShowWindow(IntPtr hWnd, Int32 nCmdShow);
'
$consolePtr = [Console.Window]::GetConsoleWindow()
[void][Console.Window]::ShowWindow($consolePtr, 0) # 0 = hide

# Configuration
$ApiUrl = "http://localhost:5000/api/report" # Default value, can be overridden by environment variable
$IdleThresholdMinutes = 5
$LogFilePath = Join-Path $env:TEMP "user_work_tracking_client.log"
$pollIntervalSeconds = 60   # Check every minute
$reportIntervalMinutes = 5  # Send reports every 5 minutes regardless of state changes
$lastReportTime = [DateTime]::MinValue # Initialize last report time

# Helper function for logging
function Write-Log($Message) {
  $timestamp = Get-Date -Format 'yyyy-MM-dd HH:mm:ss'
  try {
    "$timestamp - $Message" | Out-File -Append -FilePath $LogFilePath -Encoding UTF8 -ErrorAction Stop
  } catch {
    Write-Warning "Failed to write to log file '$LogFilePath': $($_.Exception.Message)"
    # Optionally, write to console as fallback
    Write-Host "$timestamp [LOG FALLBACK] - $Message"
  }
}

Write-Log "================ Script Started ==================="

# Try to read config from environment or a local config file
try {
  # Check for environment variable first
  if ($env:API_ENDPOINT) {
    $ApiUrl = $env:API_ENDPOINT
    Write-Log "Using API URL from environment variable: $ApiUrl"
  }
  else {
    # Look for a config file
    $configPath = Join-Path ([System.IO.Path]::GetDirectoryName($PSCommandPath)) "config.env"
    if (Test-Path $configPath) {
      Write-Log "Found config file at $configPath"
      Get-Content $configPath | ForEach-Object {
        if ($_ -match '^API_ENDPOINT=(.*)$') {
          $ApiUrl = $matches[1].Trim('"')
          Write-Log "Using API URL from config file: $ApiUrl"
        }
        if ($_ -match '^IDLE_THRESHOLD_MINUTES=(\d+)$') {
          $IdleThresholdMinutes = [int]$matches[1]
          Write-Log "Using idle threshold from config file: $IdleThresholdMinutes minutes"
        }
        if ($_ -match '^POLL_INTERVAL_SECONDS=(\d+)$') {
          $pollIntervalSeconds = [int]$matches[1]
          Write-Log "Using poll interval from config file: $pollIntervalSeconds seconds"
        }
        if ($_ -match '^REPORT_INTERVAL_MINUTES=(\d+)$') {
          $reportIntervalMinutes = [int]$matches[1]
          Write-Log "Using report interval from config file: $reportIntervalMinutes minutes"
        }
      }
    }
    else {
      Write-Log "No config file found. Using default values."
    }
  }
}
catch {
  Write-Log "Error reading configuration: $_. Using default values."
}

Write-Log "Using API URL: $ApiUrl"
Write-Log "Idle Threshold (Minutes): $IdleThresholdMinutes"
Write-Log "Report Interval (Minutes): $reportIntervalMinutes"

# Initialize state
$currentState = "working" # Start in working state (user just logged in)
$lastReportedState = $null

# Function to get idle time via the Win32 API, with quser as fallback
function Get-IdleTime {
  try {
    # First try the Win32 API method (more reliable)
    if (-not ([System.Management.Automation.PSTypeName]'IdleTime').Type) {
      Add-Type @'
using System;
using System.Runtime.InteropServices;

public class IdleTime {
    [DllImport("user32.dll")]
    public static extern bool GetLastInputInfo(ref LASTINPUTINFO plii);

    public static TimeSpan GetIdleTime() {
        LASTINPUTINFO lastInput = new LASTINPUTINFO();
        lastInput.cbSize = (uint)Marshal.SizeOf(lastInput);

        if (GetLastInputInfo(ref lastInput)) {
            uint currentTickCount = (uint)Environment.TickCount;
            uint idleTicks = currentTickCount - lastInput.dwTime;
            return TimeSpan.FromMilliseconds(idleTicks);
        } else {
            return TimeSpan.Zero;
        }
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct LASTINPUTINFO {
        public uint cbSize;
        public uint dwTime;
    }
}
'@
    }

    # Use the Win32 API method
    $idleTime = [IdleTime]::GetIdleTime()
    $idleMinutes = $idleTime.TotalMinutes
    Write-Log "Win32 API reports idle time: $idleMinutes minutes"
    return $idleTime
  }
  catch {
    Write-Log "Error using Win32 API for idle time: $_. Falling back to quser"

    # Fallback: try the quser command method
    try {
      $quser = quser $env:USERNAME | Select-Object -Skip 1
      $idleText = ($quser -replace '\s{2,}', ',').Split(',')[3]

      # If no idle time is reported, return 0
      if ($idleText -eq "." -or $idleText -eq "none" -or $null -eq $idleText) {
        Write-Log "quser reports no idle time"
        return [TimeSpan]::Zero
      }

      # Parse HH:MM format
      if ($idleText -match '(\d+):(\d+)') {
        $hours = [int]$Matches[1]
        $minutes = [int]$Matches[2]
        $result = [TimeSpan]::FromMinutes(($hours * 60) + $minutes)
        Write-Log "quser parsed idle time: $($result.TotalMinutes) minutes"
        return $result
      }

      # Parse "MM+" format (represents minutes)
      if ($idleText -match '(\d+)\+') {
        $minutes = [int]$Matches[1]
        $result = [TimeSpan]::FromMinutes($minutes)
        Write-Log "quser parsed idle time: $($result.TotalMinutes) minutes"
        return $result
      }

      # Default to zero if we couldn't parse
      Write-Log "quser couldn't parse idle format: '$idleText'"
      return [TimeSpan]::Zero
    }
    catch {
      Write-Log "Error getting idle time via quser: $_"
      return [TimeSpan]::Zero
    }
  }
}

# Function to report state to the API
function Send-StateReport {
  param (
    [string]$State
  )

  $payload = @{
    user  = $env:USERNAME
    state = $State
    ts    = (Get-Date).ToUniversalTime().ToString("o") # ISO 8601 format
  } | ConvertTo-Json

  Write-Log "Preparing to send payload: $payload"

  try {
    Write-Log "Attempting API call to $ApiUrl"
    # Send to API
    $response = Invoke-RestMethod -Uri $ApiUrl `
      -Method Post `
      -Body $payload `
      -ContentType "application/json" `
      -ErrorAction Stop

    Write-Log "API call successful. Response: $($response | ConvertTo-Json -Depth 3)"

    # Update last report time
    $script:lastReportTime = Get-Date

    # Check for success field in response
    if ($response.success -eq $true) {
      Write-Log "State '$State' reported successfully."
      return $true
    } else {
      Write-Log "API indicated failure. Message: $($response.message)"
      return $false
    }
  }
  catch {
    # Log error details
    $errorMessage = "Failed to report state '$State' to API. Error: $($_.Exception.Message)"
    if ($_.Exception.Response) {
      $statusCode = $_.Exception.Response.StatusCode
      $statusDescription = $_.Exception.Response.StatusDescription
      $errorMessage += " Status Code: $statusCode ($statusDescription)"
      try {
        $responseBody = $_.Exception.Response.GetResponseStream()
        $streamReader = New-Object System.IO.StreamReader($responseBody)
        $errorBody = $streamReader.ReadToEnd()
        $streamReader.Close()
        $responseBody.Close()
        $errorMessage += " Response Body: $errorBody"
      } catch {
        $errorMessage += " (Could not read error response body)"
      }
    }
    Write-Log $errorMessage
    return $false
  }
}

# Initial state report at startup (user just logged in = working)
if (Send-StateReport -State $currentState) {
  $lastReportedState = $currentState
  Write-Log "Initial state reported as '$currentState'"
} else {
  Write-Log "Failed to report initial state. Will retry on next check."
}

# Main monitoring loop
Write-Log "Starting activity monitoring with $IdleThresholdMinutes minute idle threshold"

try {
  while ($true) {
    # Get current idle time
    $idleTime = Get-IdleTime
    $idleMinutes = $idleTime.TotalMinutes
    Write-Log "Idle time check: $idleMinutes minutes."

    # If idle time couldn't be determined, log and wait
    if ($idleTime -lt [TimeSpan]::Zero) {
      Write-Log "Error getting idle time. Waiting for next check."
      Start-Sleep -Seconds $pollIntervalSeconds
      continue
    }

    # Determine state based on idle time
    $newState = if ($idleMinutes -ge $IdleThresholdMinutes) { "stopped" } else { "working" }
    Write-Log "Determined state: $newState (Current: $currentState)"

    # Check if it's time to send a periodic report (every 5 minutes)
    $timeSinceLastReport = (Get-Date) - $lastReportTime
    $shouldSendPeriodicReport = $timeSinceLastReport.TotalMinutes -ge $reportIntervalMinutes

    Write-Log "Time since last report: $($timeSinceLastReport.TotalMinutes.ToString("F2")) minutes (threshold: $reportIntervalMinutes)"

    # If the state changed or it's time for a periodic report, send it
    if ($newState -ne $currentState -or $shouldSendPeriodicReport) {
      # If it's a state change, update current state
      if ($newState -ne $currentState) {
        Write-Log "State changed from '$currentState' to '$newState'. Reporting to API."
        $currentState = $newState
      } else {
        Write-Log "Sending periodic report (current state: $currentState)"
      }

      # Report to API
      try {
        if (Send-StateReport -State $currentState) {
          $lastReportedState = $currentState
        }
      }
      catch {
        Write-Log "Error reporting state: $_"
      }
    }

    # Sleep until next check
    Start-Sleep -Seconds $pollIntervalSeconds
  }
}
catch {
  Write-Log "Critical error in main loop: $_"
  # Try to report a stopped state before exiting (if we were working)
  if ($currentState -eq "working") {
    Send-StateReport -State "stopped"
  }
  exit 1
}

Write-Log "================ Script Ended Gracefully ================"
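The working/stopped state machine that report.ps1 implements reduces to two small decisions per poll cycle: which state the measured idle time implies, and whether a report is due. A minimal Python sketch of that logic, using the script's default threshold and report interval (illustrative only, not part of the client):

```python
IDLE_THRESHOLD_MINUTES = 5  # mirrors the script's default

def next_state(current_state, idle_minutes):
    """Return the new state implied by the measured idle time."""
    if idle_minutes >= IDLE_THRESHOLD_MINUTES:
        return "stopped"  # crossed the inactivity threshold
    return "working"      # recent input means the user is active

def should_report(current_state, new_state, minutes_since_last_report,
                  report_interval_minutes=5):
    """Report on any state change, or periodically regardless of change."""
    return (new_state != current_state
            or minutes_since_last_report >= report_interval_minutes)

print(next_state("working", 6.0))                # stopped
print(should_report("working", "working", 2.0))  # False
```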
15
requirements.txt
Normal file
@ -0,0 +1,15 @@
Flask==2.3.3
Flask-SQLAlchemy==3.1.1
SQLAlchemy==2.0.20
gunicorn==21.2.0
python-dotenv==1.0.0
psycopg2-binary==2.9.7
pytest==7.4.0
black==23.7.0
flake8==6.1.0
isort==5.12.0
alembic==1.12.0
Werkzeug==2.3.7
Jinja2==3.1.2
itsdangerous==2.1.2
click==8.1.7
55
run.py
Normal file
@ -0,0 +1,55 @@
"""
Entry point for the Employee Workstation Activity Tracking application.

This script initializes the Flask application and runs the development server.
"""
import os

from app import create_app, init_db

# Create the application instance using the factory function
app = create_app()

if __name__ == '__main__':
    # Print some diagnostic information
    print("Starting application...")
    print(f"Instance path: {app.instance_path}")
    print(f"Instance path exists: {os.path.exists(app.instance_path)}")

    # Make sure the instance directory exists
    if not os.path.exists(app.instance_path):
        try:
            os.makedirs(app.instance_path)
            print(f"Created instance directory: {app.instance_path}")
        except Exception as e:
            print(f"Error creating instance directory: {e}")

    # Check instance directory permissions
    has_write_access = os.access(app.instance_path, os.W_OK)
    print(f"Instance path write access: {has_write_access}")

    # Print the database configuration with credentials masked
    db_uri = app.config['SQLALCHEMY_DATABASE_URI']
    masked_uri = db_uri
    if '@' in db_uri:
        parts = db_uri.split('@')
        masked_uri = "****@" + parts[1]
    print(f"Database URI: {masked_uri}")

    # Initialize the database if needed
    try:
        print("Initializing database...")
        init_db()
        print("Database initialization successful")
    except Exception as e:
        print(f"Error during database initialization: {e}")
        import traceback
        traceback.print_exc()

    # Run the Flask application
    host = os.environ.get('HOST', '0.0.0.0')
    port = int(os.environ.get('PORT', 5000))
    debug = os.environ.get('DEBUG', 'False').lower() == 'true'

    print(f"Starting Flask application on {host}:{port} (debug={debug})")
    app.run(host=host, port=port, debug=debug)
43
run_hidden.vbs
Normal file
@ -0,0 +1,43 @@
Set WshShell = CreateObject("WScript.Shell")
Set FSO = CreateObject("Scripting.FileSystemObject")

' Create a log file to help with troubleshooting
LogPath = FSO.BuildPath(FSO.GetSpecialFolder(2), "user_tracking_launcher.log") ' 2 = Temp folder
Set LogFile = FSO.OpenTextFile(LogPath, 8, True) ' 8 = ForAppending
LogFile.WriteLine(Now & " - VBS Launcher started")

' Get the current script directory
ScriptDir = FSO.GetParentFolderName(WScript.ScriptFullName)
LogFile.WriteLine(Now & " - Script directory: " & ScriptDir)

' Get the path to the PowerShell script
PowerShellScript = FSO.BuildPath(ScriptDir, "report.ps1")
LogFile.WriteLine(Now & " - PowerShell script path: " & PowerShellScript)

' Check if the PowerShell script exists
If FSO.FileExists(PowerShellScript) Then
    LogFile.WriteLine(Now & " - PowerShell script exists")
Else
    LogFile.WriteLine(Now & " - ERROR: PowerShell script does not exist!")
End If

' Build the command
PowerShellPath = "powershell.exe"
ScriptPath = """" & PowerShellScript & """"
Cmd = PowerShellPath & " -NoProfile -ExecutionPolicy Bypass -WindowStyle Hidden -File " & ScriptPath
LogFile.WriteLine(Now & " - Command: " & Cmd)

' Run the command (window style 0 = hidden, don't wait for exit)
On Error Resume Next
WshShell.Run Cmd, 0, False
If Err.Number <> 0 Then
    LogFile.WriteLine(Now & " - ERROR: " & Err.Description)
Else
    LogFile.WriteLine(Now & " - Command executed successfully")
End If
On Error GoTo 0

LogFile.Close
Set LogFile = Nothing
Set FSO = Nothing
Set WshShell = Nothing
104
schedule_task.ps1
Normal file
@ -0,0 +1,104 @@
#Requires -Version 3.0
#Requires -RunAsAdministrator
<#
.SYNOPSIS
  Creates a scheduled task to run the user activity tracking script at logon.
.DESCRIPTION
  This script creates a Windows Task Scheduler task that launches the
  report.ps1 script whenever a user logs into the system. It must be
  run with administrative privileges.
.NOTES
  File Name : schedule_task.ps1
  Author    : IT Department
  Version   : 1.4
.EXAMPLE
  .\schedule_task.ps1 -ScriptPath "C:\Scripts\report.ps1"

  # To avoid security warnings, you can unblock the files first:
  # Unblock-File -Path schedule_task.ps1
  # Unblock-File -Path report.ps1
#>

param (
  [Parameter(Mandatory=$true)]
  [string]$ScriptPath
)

# Begin by unblocking the files to prevent security warnings
Write-Host "Attempting to unblock script files..."
try {
  Unblock-File -Path $ScriptPath -ErrorAction SilentlyContinue
  Unblock-File -Path $MyInvocation.MyCommand.Path -ErrorAction SilentlyContinue
  Write-Host "Files unblocked successfully."
} catch {
  Write-Host "Unable to unblock files. If you see security warnings, you may need to run the following manually:"
  Write-Host "Unblock-File -Path $ScriptPath"
}

# Verify the script exists
if (-not (Test-Path $ScriptPath)) {
  Write-Error "Script not found at path: $ScriptPath"
  exit 1
}

# Make sure the path is absolute
$ScriptPath = (Resolve-Path $ScriptPath).Path

# Task settings
$taskName = "UserActivityTracking"
$taskDescription = "Monitors user activity and reports to central server"

# Check if the task already exists and remove it
$existingTask = Get-ScheduledTask -TaskName $taskName -ErrorAction SilentlyContinue
if ($existingTask) {
  Write-Host "Task '$taskName' already exists. Removing it..."
  schtasks /Delete /TN $taskName /F
}

# Create the PowerShell command to be executed
$powershellCmd = "-NoProfile -ExecutionPolicy Bypass -WindowStyle Hidden -File `"$ScriptPath`""

# Use schtasks.exe to create the task (more reliable than the PowerShell cmdlets)
try {
  $createCmd = "schtasks /Create /TN $taskName /TR `"powershell.exe $powershellCmd`" /SC ONLOGON /F /RU INTERACTIVE /IT /DELAY 0000:30"
  Write-Host "Creating scheduled task with command: $createCmd"
  Invoke-Expression $createCmd

  # Verify the task was created
  $taskExists = schtasks /Query /TN $taskName 2>$null
  if ($LASTEXITCODE -eq 0) {
    Write-Host "Task '$taskName' has been created successfully."
    Write-Host "The activity tracking script will now run at each user logon."
    Write-Host "Script path: $ScriptPath"
  } else {
    Write-Error "Task creation verification failed. The task may not have been created properly."
  }
} catch {
  Write-Error "Failed to create scheduled task: $_"
  exit 1
}

# Copy config.env next to the script file if it exists
$scriptDir = Split-Path -Parent $ScriptPath
$configEnvSource = Join-Path (Get-Location) "config.env"
$configEnvDest = Join-Path $scriptDir "config.env"

if (Test-Path $configEnvSource) {
  # Check if source and destination are different
  if ((Resolve-Path $configEnvSource).Path -ne (Resolve-Path $configEnvDest -ErrorAction SilentlyContinue).Path) {
    try {
      Copy-Item -Path $configEnvSource -Destination $configEnvDest -Force
      Write-Host "Copied config.env to script directory: $configEnvDest"
    }
    catch {
      Write-Warning "Failed to copy config.env to script directory: $_"
    }
  } else {
    Write-Host "config.env is already in the script directory."
  }
}
else {
  Write-Warning "config.env not found in current directory. Make sure to manually configure the script."
}

Write-Host "`nIMPORTANT: To manually verify the task was created correctly, open Task Scheduler`n(taskschd.msc) and check for the 'UserActivityTracking' task."
62
static/css/dashboard.css
Normal file
@ -0,0 +1,62 @@
body {
    font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
    background-color: #f8f9fa;
    padding: 20px;
}
.card {
    box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
    margin-bottom: 20px;
}
.card-header {
    background-color: #e9ecef; /* Slightly darker header */
    font-weight: 600;
}
.btn-filter {
    margin-right: 5px;
}
.table th {
    background-color: #e9ecef; /* Match card header */
    font-weight: 600;
}
#reportTable th:nth-child(1), /* User column */
#reportTable td:nth-child(1) {
    width: 20%;
}
#reportTable th:nth-child(2), /* Period column */
#reportTable td:nth-child(2) {
    width: 30%;
}
#reportTable th:nth-child(3), /* Duration column */
#reportTable td:nth-child(3) {
    width: 20%;
    text-align: center;
}
#reportTable th:nth-child(4), /* First Login Time column */
#reportTable td:nth-child(4) {
    width: 30%;
    text-align: center;
}
.spinner-border {
    width: 1.5rem;
    height: 1.5rem;
}
.user-link {
    color: #0d6efd;
    text-decoration: none;
    cursor: pointer;
}
.user-link:hover {
    text-decoration: underline;
}
.modal-body table th:nth-child(1) {
    width: 25%;
}
.modal-body table th:nth-child(2),
.modal-body table th:nth-child(3) {
    width: 30%;
}
.modal-body table th:nth-child(4),
.modal-body table td:nth-child(4) {
    width: 15%;
    text-align: center;
}
446
static/js/dashboard.js
Normal file
@ -0,0 +1,446 @@
document.addEventListener('DOMContentLoaded', function() {
    const filterButtons = document.querySelectorAll('.btn-filter');
    const userFilterInput = document.getElementById('userFilter');
    const clearUserFilterBtn = document.getElementById('clearUserFilter');
    const reportBody = document.getElementById('reportBody');
    const periodHeader = document.getElementById('periodHeader');
    const loadingSpinner = document.getElementById('loadingSpinner');
    const errorMessage = document.getElementById('errorMessage');
    const weekDaySelector = document.getElementById('weekDaySelector');
    const weekDaySelect = document.getElementById('weekDaySelect');

    // User activity modal elements
    const userActivityModal = document.getElementById('userActivityModal');
    const modalUsername = document.getElementById('modalUsername');
    const startDateInput = document.getElementById('startDate');
    const endDateInput = document.getElementById('endDate');
    const applyDateRangeBtn = document.getElementById('applyDateRange');
    const userActivityBody = document.getElementById('userActivityBody');
    const modalLoadingSpinner = document.getElementById('modalLoadingSpinner');
    const modalErrorMessage = document.getElementById('modalErrorMessage');

    // Initialize date inputs with the current date
    const today = new Date().toISOString().split('T')[0];
    startDateInput.value = today;
    endDateInput.value = today;

    let currentPeriod = 'daily'; // Default period
    let userActivityModalInstance = null;
    let selectedWeekDay = null;
    let currentDate = new Date(); // Track the currently selected date
    let selectedDate = formatDateForAPI(new Date()); // Current date in YYYY-MM-DD format

    // User filter debounce timer
    let userFilterTimeout = null;

    // Format date as YYYY-MM-DD for the API
    function formatDateForAPI(date) {
        return date.toISOString().split('T')[0];
    }

    // Format date for display (DD/MM/YYYY)
    function formatDateForDisplay(dateStr) {
        const date = new Date(dateStr);
        return date.toLocaleDateString('en-GB', {
            day: '2-digit',
            month: '2-digit',
            year: 'numeric'
        });
    }

    // Update the current date display in the UI
    function updateDateDisplay() {
        const currentDateDisplay = document.getElementById('currentDateDisplay');
        currentDateDisplay.textContent = formatDateForDisplay(selectedDate);
    }

    // Navigate to the previous day
    function goToPreviousDay() {
        currentDate.setDate(currentDate.getDate() - 1);
        selectedDate = formatDateForAPI(currentDate);
        updateDateDisplay();
        loadReportData();
    }

    // Navigate to the next day
    function goToNextDay() {
        currentDate.setDate(currentDate.getDate() + 1);
        selectedDate = formatDateForAPI(currentDate);
        updateDateDisplay();
        loadReportData();
    }

    // Handle date selection from the calendar
    function handleDateSelection() {
        const dateSelector = document.getElementById('dateSelector');
        if (dateSelector.value) {
            selectedDate = dateSelector.value;
            currentDate = new Date(selectedDate);
            updateDateDisplay();
            dateSelector.style.display = 'none';
            loadReportData();
        }
    }

    // Toggle calendar visibility
    function toggleCalendar() {
        const dateSelector = document.getElementById('dateSelector');
        if (dateSelector.style.display === 'none') {
            dateSelector.style.display = 'inline-block';
            dateSelector.value = selectedDate;
            dateSelector.focus();
        } else {
            dateSelector.style.display = 'none';
        }
    }

    // Get the days of the current week (Monday to Sunday)
    function getDaysOfWeek() {
        const today = new Date();
        const currentDay = today.getDay(); // 0 is Sunday, 1 is Monday, ...
        const mondayOffset = currentDay === 0 ? -6 : 1 - currentDay; // Days from today back to Monday

        const days = [];
        const monday = new Date(today);
        monday.setDate(today.getDate() + mondayOffset);

        // Generate an array with dates for Monday through Sunday
        for (let i = 0; i < 7; i++) {
            const date = new Date(monday);
            date.setDate(monday.getDate() + i);
            days.push({
                date: date.toISOString().split('T')[0], // YYYY-MM-DD format
                dayName: date.toLocaleDateString('en-US', { weekday: 'long' }), // Monday, Tuesday, etc.
                displayDate: date.toLocaleDateString('en-GB') // DD/MM/YYYY format for display
            });
        }
        return days;
    }

    // Populate the week day selector dropdown
    function populateWeekDaySelector() {
        weekDaySelect.innerHTML = '<option value="">All Week</option>';
        const days = getDaysOfWeek();

        days.forEach(day => {
            const option = document.createElement('option');
            option.value = day.date;
            option.textContent = `${day.dayName} (${day.displayDate})`;
            weekDaySelect.appendChild(option);
        });
    }

    // Fetch and display report data
    async function loadReportData() {
        loadingSpinner.classList.remove('d-none'); // Show spinner
        errorMessage.classList.add('d-none'); // Hide error message
        reportBody.innerHTML = ''; // Clear previous data

        const user = userFilterInput.value.trim();
        let apiUrl = `/api/reports/${currentPeriod}`;
        const params = new URLSearchParams();

        if (user) {
            params.append('user', user);
        }

        // Add date parameter for the daily view
        if (currentPeriod === 'daily') {
            params.append('date', selectedDate);
        }

        // Add selected day parameter for the weekly view if a specific day is selected
        if (currentPeriod === 'weekly' && selectedWeekDay) {
            params.append('day', selectedWeekDay);
        }

        if (params.toString()) {
            apiUrl += `?${params.toString()}`;
        }

        try {
            const response = await fetch(apiUrl);
            if (!response.ok) {
                throw new Error(`HTTP error! status: ${response.status}`);
            }
            const result = await response.json();

            if (result.success && result.data) {
                populateTable(result.data);
            } else {
                showError(result.message || 'Failed to load data.');
            }
        } catch (error) {
            console.error('Error fetching report data:', error);
            showError('Network error or failed to fetch data.');
        } finally {
            loadingSpinner.classList.add('d-none'); // Hide spinner
        }
    }

    // Populate the table with data
    function populateTable(data) {
        // Update the table header based on the period
        switch (currentPeriod) {
            case 'daily': periodHeader.textContent = 'Day'; break;
            case 'weekly': periodHeader.textContent = 'Week'; break;
            case 'monthly': periodHeader.textContent = 'Month Starting'; break;
            default: periodHeader.textContent = 'Period';
        }

        // Show/hide the First Login Time column based on period and day selection
        const firstLoginHeader = document.querySelector('#reportTable th:nth-child(4)');
        // Hide it for the monthly view and for the weekly view without a day selection
        const hideLoginTime = currentPeriod === 'monthly' ||
                              (currentPeriod === 'weekly' && !selectedWeekDay);

        if (hideLoginTime) {
            firstLoginHeader.style.display = 'none';
        } else {
            // Show the column for the daily view or when a specific day is selected
            firstLoginHeader.style.display = '';
        }

        if (data.length === 0) {
            reportBody.innerHTML = '<tr><td colspan="4" class="text-center text-muted">No data found for the selected filters.</td></tr>';
            return;
        }

        data.forEach(entry => {
            const row = document.createElement('tr');
            let periodValue = '';
            // Determine which period key to use based on currentPeriod
            switch (currentPeriod) {
                case 'daily': periodValue = entry.day; break;
                case 'weekly': periodValue = entry.week_start; break;
                case 'monthly': periodValue = entry.month_start; break;
            }

            // Format the date string from ISO (YYYY-MM-DD) to DD/MM/YYYY
            let formattedPeriod = periodValue;
            if (periodValue && periodValue.match(/^\d{4}-\d{2}-\d{2}/)) {
                const dateParts = periodValue.substring(0, 10).split('-');
                formattedPeriod = `${dateParts[2]}/${dateParts[1]}/${dateParts[0]}`;
            }
|
||||
// Format first login time for display (from ISO to local time)
|
||||
let firstLoginTime = 'N/A';
|
||||
if (entry.first_login_time) {
|
||||
const loginDate = new Date(entry.first_login_time);
|
||||
firstLoginTime = loginDate.toLocaleTimeString([], {
|
||||
hour: '2-digit',
|
||||
minute: '2-digit',
|
||||
hour12: true
|
||||
});
|
||||
}
|
||||
|
||||
if (hideLoginTime) {
|
||||
// Don't include First Login Time cell in monthly and weekly view without day selection
|
||||
row.innerHTML = `
|
||||
<td><a class="user-link" data-user="${entry.user}">${entry.user}</a></td>
|
||||
<td>${formattedPeriod || 'N/A'}</td>
|
||||
<td>${entry.duration_hours !== null ? entry.duration_hours.toFixed(2) : 'N/A'}</td>
|
||||
`;
|
||||
} else {
|
||||
row.innerHTML = `
|
||||
<td><a class="user-link" data-user="${entry.user}">${entry.user}</a></td>
|
||||
<td>${formattedPeriod || 'N/A'}</td>
|
||||
<td>${entry.duration_hours !== null ? entry.duration_hours.toFixed(2) : 'N/A'}</td>
|
||||
<td>${firstLoginTime}</td>
|
||||
`;
|
||||
}
|
||||
reportBody.appendChild(row);
|
||||
});
|
||||
|
||||
// Add event listeners to user links
|
||||
document.querySelectorAll('.user-link').forEach(link => {
|
||||
link.addEventListener('click', function() {
|
||||
const username = this.getAttribute('data-user');
|
||||
showUserActivityModal(username);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// Function to show error messages
|
||||
function showError(message) {
|
||||
errorMessage.textContent = `Error: ${message}`;
|
||||
errorMessage.classList.remove('d-none');
|
||||
reportBody.innerHTML = '<tr><td colspan="4" class="text-center text-danger">Failed to load data.</td></tr>';
|
||||
}
|
||||
|
||||
// Function to show user activity modal
|
||||
function showUserActivityModal(username) {
|
||||
modalUsername.textContent = username;
|
||||
userActivityBody.innerHTML = '';
|
||||
modalErrorMessage.classList.add('d-none');
|
||||
|
||||
// Initialize and show modal with Bootstrap 5
|
||||
if (!userActivityModalInstance) {
|
||||
userActivityModalInstance = new bootstrap.Modal(userActivityModal);
|
||||
}
|
||||
userActivityModalInstance.show();
|
||||
|
||||
// Load user activity data
|
||||
loadUserActivityData(username);
|
||||
}
|
||||
|
||||
// Function to load user activity data
|
||||
async function loadUserActivityData(username) {
|
||||
modalLoadingSpinner.classList.remove('d-none');
|
||||
userActivityBody.innerHTML = '';
|
||||
modalErrorMessage.classList.add('d-none');
|
||||
|
||||
const startDate = startDateInput.value;
|
||||
const endDate = endDateInput.value;
|
||||
|
||||
try {
|
||||
const response = await fetch(`/api/user-activity/${encodeURIComponent(username)}?start_date=${startDate}&end_date=${endDate}`);
|
||||
if (!response.ok) {
|
||||
throw new Error(`HTTP error! status: ${response.status}`);
|
||||
}
|
||||
|
||||
const result = await response.json();
|
||||
if (result.success && result.data) {
|
||||
populateUserActivityTable(result.data.activities);
|
||||
} else {
|
||||
showModalError(result.message || 'Failed to load activity data');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error fetching user activity data:', error);
|
||||
showModalError('Network error or failed to fetch activity data');
|
||||
} finally {
|
||||
modalLoadingSpinner.classList.add('d-none');
|
||||
}
|
||||
}
|
||||
|
||||
// Function to populate user activity table
|
||||
function populateUserActivityTable(activities) {
|
||||
if (activities.length === 0) {
|
||||
userActivityBody.innerHTML = '<tr><td colspan="4" class="text-center text-muted">No activity found for the selected date range.</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
activities.forEach(activity => {
|
||||
const row = document.createElement('tr');
|
||||
|
||||
// Format date and times for display
|
||||
const date = new Date(activity.date).toLocaleDateString();
|
||||
const startTime = new Date(activity.start_time).toLocaleTimeString([], {
|
||||
hour: '2-digit',
|
||||
minute: '2-digit',
|
||||
hour12: true
|
||||
});
|
||||
const endTime = new Date(activity.end_time).toLocaleTimeString([], {
|
||||
hour: '2-digit',
|
||||
minute: '2-digit',
|
||||
hour12: true
|
||||
});
|
||||
|
||||
row.innerHTML = `
|
||||
<td>${date}</td>
|
||||
<td>${startTime}</td>
|
||||
<td>${endTime}</td>
|
||||
<td>${activity.duration_hours.toFixed(2)}</td>
|
||||
`;
|
||||
userActivityBody.appendChild(row);
|
||||
});
|
||||
}
|
||||
|
||||
// Function to show modal error message
|
||||
function showModalError(message) {
|
||||
modalErrorMessage.textContent = `Error: ${message}`;
|
||||
modalErrorMessage.classList.remove('d-none');
|
||||
userActivityBody.innerHTML = '<tr><td colspan="4" class="text-center text-danger">Failed to load activity data.</td></tr>';
|
||||
}
|
||||
|
||||
// Add event listeners to filter buttons
|
||||
filterButtons.forEach(button => {
|
||||
button.addEventListener('click', function() {
|
||||
// Update active button state
|
||||
filterButtons.forEach(btn => btn.classList.remove('active'));
|
||||
this.classList.add('active');
|
||||
currentPeriod = this.getAttribute('data-period');
|
||||
|
||||
// Show/hide week day selector based on period
|
||||
if (currentPeriod === 'weekly') {
|
||||
weekDaySelector.classList.remove('d-none');
|
||||
dateNavControls.classList.add('d-none');
|
||||
populateWeekDaySelector();
|
||||
} else if (currentPeriod === 'daily') {
|
||||
weekDaySelector.classList.add('d-none');
|
||||
dateNavControls.classList.remove('d-none');
|
||||
updateDateDisplay();
|
||||
} else {
|
||||
weekDaySelector.classList.add('d-none');
|
||||
dateNavControls.classList.add('d-none');
|
||||
}
|
||||
|
||||
loadReportData(); // Reload data when period changes
|
||||
});
|
||||
});
|
||||
|
||||
// Add event listener for user filter input (typing with debounce)
|
||||
userFilterInput.addEventListener('input', function() {
|
||||
// Clear any existing timeout
|
||||
if (userFilterTimeout) {
|
||||
clearTimeout(userFilterTimeout);
|
||||
}
|
||||
|
||||
// Set a new timeout to delay the filter application (300ms debounce)
|
||||
userFilterTimeout = setTimeout(function() {
|
||||
loadReportData();
|
||||
}, 300);
|
||||
});
|
||||
|
||||
// Add event listener for clear user filter button
|
||||
clearUserFilterBtn.addEventListener('click', function() {
|
||||
userFilterInput.value = '';
|
||||
loadReportData();
|
||||
});
|
||||
|
||||
// Add event listener for Apply Date Range button
|
||||
applyDateRangeBtn.addEventListener('click', function() {
|
||||
const username = modalUsername.textContent;
|
||||
loadUserActivityData(username);
|
||||
});
|
||||
|
||||
// Add event listener for week day selector
|
||||
weekDaySelect.addEventListener('change', function() {
|
||||
selectedWeekDay = this.value;
|
||||
loadReportData(); // Reload data when day selection changes
|
||||
});
|
||||
|
||||
// Initialize the dashboard
|
||||
function initializeDashboard() {
|
||||
// Initialize date navigation elements
|
||||
const dateNavControls = document.getElementById('dateNavControls');
|
||||
const prevDateBtn = document.getElementById('prevDateBtn');
|
||||
const nextDateBtn = document.getElementById('nextDateBtn');
|
||||
const calendarBtn = document.getElementById('calendarBtn');
|
||||
const dateSelector = document.getElementById('dateSelector');
|
||||
|
||||
// Set initial value for date selector
|
||||
dateSelector.value = selectedDate;
|
||||
|
||||
// Update the date display
|
||||
updateDateDisplay();
|
||||
|
||||
// Show date navigation controls if daily view is active
|
||||
if (currentPeriod === 'daily') {
|
||||
dateNavControls.classList.remove('d-none');
|
||||
}
|
||||
|
||||
// Add event listeners for date navigation
|
||||
prevDateBtn.addEventListener('click', goToPreviousDay);
|
||||
nextDateBtn.addEventListener('click', goToNextDay);
|
||||
calendarBtn.addEventListener('click', toggleCalendar);
|
||||
dateSelector.addEventListener('change', handleDateSelection);
|
||||
|
||||
// Load initial report data
|
||||
loadReportData();
|
||||
}
|
||||
|
||||
// Initial load
|
||||
initializeDashboard();
|
||||
});
171
templates/dashboard.html
Normal file
@ -0,0 +1,171 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Employee Activity Dashboard</title>
    <!-- Bootstrap CSS -->
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <!-- Bootstrap Icons -->
    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.10.0/font/bootstrap-icons.css">
    <!-- Custom CSS -->
    <link rel="stylesheet" href="{{ url_for('static', filename='css/dashboard.css') }}">
</head>
<body>
    <div class="container">
        <header class="mb-4">
            <h1 class="text-center my-4">Employee Working Time Report</h1>
            <p class="text-center text-muted">Aggregated working hours by user</p>
        </header>

        <div class="row">
            <!-- Time Filter & User Filter -->
            <div class="col-12 mb-3">
                <div class="card">
                    <div class="card-header">Filters</div>
                    <div class="card-body d-flex justify-content-between align-items-center">
                        <div>
                            <span class="me-2 fw-bold">Time Period:</span>
                            <div class="btn-group">
                                <button class="btn btn-outline-primary btn-sm btn-filter active" data-period="daily">Today</button>
                                <button class="btn btn-outline-primary btn-sm btn-filter" data-period="weekly">This Week</button>
                                <button class="btn btn-outline-primary btn-sm btn-filter" data-period="monthly">This Month</button>
                            </div>

                            <!-- Date navigation controls (initially hidden) -->
                            <div id="dateNavControls" class="mt-2 d-none">
                                <div class="d-flex align-items-center">
                                    <button id="prevDateBtn" class="btn btn-sm btn-outline-secondary me-2">
                                        <i class="bi bi-chevron-left"></i>
                                    </button>
                                    <span id="currentDateDisplay" class="me-2"></span>
                                    <button id="nextDateBtn" class="btn btn-sm btn-outline-secondary me-2">
                                        <i class="bi bi-chevron-right"></i>
                                    </button>
                                    <button id="calendarBtn" class="btn btn-sm btn-outline-secondary" title="Select specific date">
                                        <i class="bi bi-calendar3"></i>
                                    </button>
                                    <input type="date" id="dateSelector" class="form-control form-control-sm ms-2" style="width: auto; display: none;">
                                </div>
                            </div>

                            <!-- Day selector for weekly view (initially hidden) -->
                            <div id="weekDaySelector" class="mt-2 d-none">
                                <span class="me-2">Select day:</span>
                                <select id="weekDaySelect" class="form-select form-select-sm" style="width: auto; display: inline-block;">
                                    <!-- Options will be populated by JavaScript -->
                                </select>
                            </div>
                        </div>
                        <div class="d-flex align-items-center">
                            <label for="userFilter" class="form-label me-2 mb-0 fw-bold">User:</label>
                            <div class="input-group" style="width: 220px;">
                                <input type="text" id="userFilter" class="form-control form-control-sm" placeholder="Filter by username...">
                                <button class="btn btn-outline-secondary btn-sm" type="button" id="clearUserFilter">
                                    <i class="bi bi-x"></i> Clear
                                </button>
                            </div>
                        </div>
                    </div>
                </div>
            </div>
        </div>

        <div class="row">
            <!-- Working Time Report Table -->
            <div class="col-12">
                <div class="card">
                    <div class="card-header d-flex justify-content-between align-items-center">
                        <span>Working Time Report</span>
                        <div id="loadingSpinner" class="spinner-border text-primary d-none" role="status">
                            <span class="visually-hidden">Loading...</span>
                        </div>
                    </div>
                    <div class="card-body">
                        <div class="table-responsive">
                            <table class="table table-striped table-hover" id="reportTable">
                                <thead>
                                    <tr>
                                        <th>User</th>
                                        <th id="periodHeader">Day</th> <!-- Will be updated by JS -->
                                        <th>Duration (Hours)</th>
                                        <th>First Login Time</th>
                                    </tr>
                                </thead>
                                <tbody id="reportBody">
                                    <!-- Report data will be populated via JavaScript -->
                                    <tr>
                                        <td colspan="4" class="text-center text-muted">Select filters and load data.</td>
                                    </tr>
                                </tbody>
                            </table>
                        </div>
                        <div id="errorMessage" class="alert alert-danger d-none mt-3" role="alert">
                            Error loading report data. Please try again.
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </div>

    <!-- User Activity Details Modal -->
    <div class="modal fade" id="userActivityModal" tabindex="-1" aria-labelledby="userActivityModalLabel" aria-hidden="true">
        <div class="modal-dialog modal-lg">
            <div class="modal-content">
                <div class="modal-header">
                    <h5 class="modal-title" id="userActivityModalLabel">User Activity Details</h5>
                    <button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
                </div>
                <div class="modal-body">
                    <div class="d-flex justify-content-between mb-3">
                        <div>
                            <strong>User:</strong> <span id="modalUsername"></span>
                        </div>
                        <div>
                            <div class="input-group input-group-sm">
                                <span class="input-group-text">Date Range</span>
                                <input type="date" id="startDate" class="form-control">
                                <span class="input-group-text">to</span>
                                <input type="date" id="endDate" class="form-control">
                                <button class="btn btn-outline-primary" id="applyDateRange">Apply</button>
                            </div>
                        </div>
                    </div>
                    <div id="modalLoadingSpinner" class="d-flex justify-content-center py-3 d-none">
                        <div class="spinner-border text-primary" role="status">
                            <span class="visually-hidden">Loading...</span>
                        </div>
                    </div>
                    <div class="table-responsive">
                        <table class="table table-striped" id="userActivityTable">
                            <thead>
                                <tr>
                                    <th>Date</th>
                                    <th>Start Time</th>
                                    <th>End Time</th>
                                    <th>Duration (Hours)</th>
                                </tr>
                            </thead>
                            <tbody id="userActivityBody">
                                <!-- Activity data will be populated here -->
                            </tbody>
                        </table>
                    </div>
                    <div id="modalErrorMessage" class="alert alert-danger d-none mt-3" role="alert">
                        Error loading activity data.
                    </div>
                </div>
                <div class="modal-footer">
                    <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
                </div>
            </div>
        </div>
    </div>

    <!-- Bootstrap JS -->
    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
    <!-- Custom JS -->
    <script src="{{ url_for('static', filename='js/dashboard.js') }}"></script>
</body>
</html>
276
workflow_state.md
Normal file
@ -0,0 +1,276 @@
# Workflow State & Rules (STM + Rules + Log)

*This file contains the dynamic state, embedded rules, active plan, and log for the current session.*
*It is read and updated frequently by the AI during its operational loop.*

---

## State

*Holds the current status of the workflow.*

```yaml
Phase: CONSTRUCT # Current workflow phase (ANALYZE, BLUEPRINT, CONSTRUCT, VALIDATE, BLUEPRINT_REVISE)
Status: COMPLETED # Current status (READY, IN_PROGRESS, BLOCKED_*, NEEDS_*, COMPLETED, COMPLETED_ITERATION)
CurrentTaskID: RefactorCodebase # Identifier for the main task being worked on
CurrentStep: null # Identifier for the specific step in the plan being executed
CurrentItem: null # Identifier for the item currently being processed in iteration
```

---

## Plan

*Contains the step-by-step implementation plan generated during the BLUEPRINT phase.*

**Task: ModifyReportingDashboard**
*(Completed Task)*
*Modify the reporting and dashboard to show aggregated active working time duration in simple tables.*

* `[✓] Step Mod-1: Define new SQL queries in app.py to calculate daily, weekly, and monthly working durations per user using LEAD() and JULIANDAY(), aggregating with SUM().`
* `[✓] Step Mod-2: Update Flask endpoints in app.py:`
    * `[✓] Step Mod-2.1: Modify /api/reports/daily to use the new daily duration query.`
    * `[✓] Step Mod-2.2: Create /api/reports/weekly using a new weekly duration query.`
    * `[✓] Step Mod-2.3: Create /api/reports/monthly using a new monthly duration query.`
    * `[✓] Step Mod-2.4: Ensure endpoints return JSON data formatted for table display (e.g., list of dicts with user, period, duration_hours).`
* `[✓] Step Mod-3: Update templates/dashboard.html:`
    * `[✓] Step Mod-3.1: Remove Chart.js script inclusion and chart-related HTML elements.`
    * `[✓] Step Mod-3.2: Add JavaScript to fetch data from the new/updated API endpoints.`
    * `[✓] Step Mod-3.3: Create HTML tables to display the fetched duration data (User, Period, Duration).`
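The LEAD()/JULIANDAY() approach named in Step Mod-1 can be sketched roughly as below. This is a minimal, self-contained illustration against an in-memory SQLite database with made-up sample events; the `work_events` table and column names are assumptions for the sketch, not necessarily the project's exact schema:

```python
import sqlite3

# Each state-change event ("working"/"stopped") opens an interval that ends at
# the next event for the same user. LEAD() pairs every row with that next
# timestamp; JULIANDAY() differences (in days) are converted to hours, and
# only "working" intervals are summed per user per day.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE work_events ("user" TEXT, state TEXT, timestamp TEXT);
    INSERT INTO work_events VALUES
        ('robert', 'working', '2024-01-15 09:00:00'),
        ('robert', 'stopped', '2024-01-15 11:00:00'),
        ('robert', 'working', '2024-01-15 12:00:00'),
        ('robert', 'stopped', '2024-01-15 12:30:00');
""")

DAILY_DURATION_SQL = """
    SELECT "user",
           DATE(timestamp) AS day,
           SUM((JULIANDAY(next_ts) - JULIANDAY(timestamp)) * 24) AS duration_hours
    FROM (
        SELECT "user", state, timestamp,
               LEAD(timestamp) OVER (PARTITION BY "user" ORDER BY timestamp) AS next_ts
        FROM work_events
    )
    WHERE state = 'working' AND next_ts IS NOT NULL
    GROUP BY "user", day
"""
rows = conn.execute(DAILY_DURATION_SQL).fetchall()
print(rows)  # one row per user per day: 2h + 0.5h of "working" time, i.e. ~2.5 hours
```

Note the quoted `"user"` identifier: SQLite tolerates the bare name, but quoting keeps the query portable to PostgreSQL, where `user` is reserved (see Task FixPostgreSQLUserDisplay below).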
**Task: AddExtensiveLogging**
*(Completed Task)*
*Add extensive file-based logging to both the Flask server and the PowerShell client.*

* `[✓] Step Log-1: Configure Flask Logging (app.py):`
    * `[✓] Step Log-1.1: Import logging and logging.handlers.`
    * `[✓] Step Log-1.2: In create_app or equivalent setup location: Configure a RotatingFileHandler to write to instance/server.log (ensure instance folder exists). Set log level (e.g., INFO). Define a log format (e.g., %(asctime)s - %(levelname)s - %(message)s). Add the handler to app.logger.`
* `[✓] Step Log-2: Add Logging Statements to Flask App (app.py):`
    * `[✓] Step Log-2.1: Log application startup.`
    * `[✓] Step Log-2.2: Log incoming requests to /api/report with payload details.`
    * `[✓] Step Log-2.3: Log database event creation attempts and success/failure.`
    * `[✓] Step Log-2.4: Log incoming requests to dashboard/report endpoints.`
    * `[✓] Step Log-2.5: Log report data fetching and generation.`
    * `[✓] Step Log-2.6: Log errors caught by global error handlers.`
* `[✓] Step Log-3: Implement PowerShell Logging (report.ps1):`
    * `[✓] Step Log-3.1: Define a log file path (e.g., $env:TEMP\user_work_tracking_client.log).`
    * `[✓] Step Log-3.2: Create a helper function Write-Log($Message) that prepends a timestamp and appends the message to the log file using Out-File -Append.`
* `[✓] Step Log-4: Add Logging Statements to PowerShell Script (report.ps1):`
    * `[✓] Step Log-4.1: Log script start and end.`
    * `[✓] Step Log-4.2: Log API URL being used.`
    * `[✓] Step Log-4.3: Log results of idle time check.`
    * `[✓] Step Log-4.4: Log detected state (working/stopped) and any state changes.`
    * `[✓] Step Log-4.5: Log the payload being sent to the API.`
    * `[✓] Step Log-4.6: Log the attempt to call the API.`
    * `[✓] Step Log-4.7: Log API call success response.`
    * `[✓] Step Log-4.8: Log API call failures within the catch block.`
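The server-side setup from Steps Log-1.1 and Log-1.2 can be sketched as follows. This is a standalone version: `logger` would be `app.logger` and `instance_path` would be `app.instance_path` in the real Flask app, and the 1 MB rotation size is an assumption (the plan does not specify one):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

def configure_file_logging(logger, instance_path, level=logging.INFO):
    """Attach a rotating file handler writing to <instance_path>/server.log."""
    os.makedirs(instance_path, exist_ok=True)  # ensure the instance folder exists
    handler = RotatingFileHandler(
        os.path.join(instance_path, "server.log"),
        maxBytes=1_000_000,  # rotate at ~1 MB (assumed; not specified in the plan)
        backupCount=3,
    )
    handler.setFormatter(logging.Formatter("%(asctime)s - %(levelname)s - %(message)s"))
    handler.setLevel(level)
    logger.addHandler(handler)
    logger.setLevel(level)
    return handler

# Standalone demo; in the real app this would be
# configure_file_logging(app.logger, app.instance_path).
logger = logging.getLogger("work_tracking_demo")
with tempfile.TemporaryDirectory() as tmp:
    handler = configure_file_logging(logger, tmp)
    logger.info("Application startup")  # Step Log-2.1
    handler.close()
    with open(os.path.join(tmp, "server.log")) as f:
        first_line = f.readline().strip()
print(first_line.endswith("INFO - Application startup"))  # → True
```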
**Task: FixReportFunctionError**
*(Completed Task)*
*Implement the missing `fetch_duration_report` and `format_report_data` functions in `app.py`.*

* `[✓] Step Fix-1: Define fetch_duration_report function in app.py:`
    * `[✓] Step Fix-1.1: Use calculate_duration_sql(time_period) to get the base SQL query.`
    * `[✓] Step Fix-1.2: Initialize params = {}.`
    * `[✓] Step Fix-1.3: If user_filter is provided, modify the SQL using filter_sql_by_user(sql_query, user_filter) and add user_filter to params.`
    * `[✓] Step Fix-1.4: Execute the query: results = db.session.execute(text(sql_query), params).mappings().all().`
    * `[✓] Step Fix-1.5: Return the results.`
* `[✓] Step Fix-2: Define format_report_data function in app.py:`
    * `[✓] Step Fix-2.1: Determine the period key based on time_period.`
    * `[✓] Step Fix-2.2: Iterate through the results (list of RowMapping objects).`
    * `[✓] Step Fix-2.3: For each row, create a dictionary containing needed data.`
    * `[✓] Step Fix-2.4: Return the list of formatted dictionaries.`

**Task: FixDatabaseAccessError**
*(Completed Task)*
*Ensure the SQLite database file is created and accessible by uncommenting the database initialization call in `app.py`.*

* `[✓] Step DBFix-1: Uncomment the init_db() call within the if __name__ == '__main__': block in app.py to ensure database tables are created on script execution if they don't exist.`

**Task: DiagnoseDatabaseAccessError**
*(Completed Task)*
*Investigate and resolve the persistent `sqlite3.OperationalError: unable to open database file` error during `init_db()`.*

* `[✓] Step DiagDB-1: Add diagnostic logging in app.py within the if __name__ == '__main__': block, just before the init_db() call:`
    * `[✓] Step DiagDB-1.1: Log the resolved app.instance_path.`
    * `[✓] Step DiagDB-1.2: Log whether app.instance_path exists using os.path.exists.`
    * `[✓] Step DiagDB-1.3: Log whether the process has write access to app.instance_path using os.access(app.instance_path, os.W_OK).`
* `[✓] Step DiagDB-2: Analyze the output from the diagnostic logging after running the script again.`
* `[✓] Step DiagDB-3: (Conditional - Based on DiagDB-2) Fixed database access issue by changing the SQLite database URI from a relative path ('sqlite:///instance/work_events.db') to an absolute path using os.path.join(app.instance_path, 'work_events.db').`

**Task: UpdateSampleData**
*(Completed Task)*
*Update the SQL sample data and queries to support the user's requirements for tracking work time durations.*

* `[✓] Step UpdateSQL-1: Updated create_db.sql to add sample data for Robert, Ilia, and Nika with working and stopped events across today, this week, and this month periods.`
* `[✓] Step UpdateSQL-2: Added enhanced SQL queries for calculating working durations with first login time for daily, weekly, and monthly reports.`
* `[✓] Step UpdateSQL-3: Added detailed SQL queries for filtering by specific time periods (today, this week, this month) and for tracking individual user work sessions with start/end times.`

**Task: UpdateFrontend**
*(Completed Task)*
*Update the frontend dashboard to show First Login Time column and add functionality to view detailed user work action logs.*

* `[✓] Step Frontend-1: Updated the backend in app.py to include first_login_time field in the report data:`
    * `[✓] Step Frontend-1.1: Modified calculate_duration_sql() to include first_login_time in the SQL query results.`
    * `[✓] Step Frontend-1.2: Updated format_report_data() to include first_login_time in the formatted output.`
    * `[✓] Step Frontend-1.3: Added fetch_user_activity() function to retrieve detailed user activity logs.`
    * `[✓] Step Frontend-1.4: Added format_user_activity() function to format detailed activity data.`
* `[✓] Step Frontend-2: Added a new endpoint /api/user-activity/<username> for detailed user logs:`
    * `[✓] Step Frontend-2.1: Created endpoint with support for date range filtering.`
    * `[✓] Step Frontend-2.2: Added proper error handling and logging.`
* `[✓] Step Frontend-3: Updated the dashboard.html template:`
    * `[✓] Step Frontend-3.1: Added a "First Login Time" column to the main report table.`
    * `[✓] Step Frontend-3.2: Made usernames clickable to view detailed activity.`
    * `[✓] Step Frontend-3.3: Added a modal dialog to display detailed activity.`
    * `[✓] Step Frontend-3.4: Added date range selection for filtering detailed activity.`
    * `[✓] Step Frontend-3.5: Added JavaScript to fetch and display the detailed activity data.`

**Task: FixPostgreSQLUserDisplay**
*(Completed Task)*
*Fix the issue where the dashboard was showing database login username instead of actual user data.*

* `[✓] Step PGSQL-1: Added diagnostic logging to identify the source of the PostgreSQL username issue:`
    * `[✓] Step PGSQL-1.1: Added debug logging in report API endpoints to log raw and formatted usernames.`
    * `[✓] Step PGSQL-1.2: Added direct database queries to verify actual data in the database.`
    * `[✓] Step PGSQL-1.3: Detected inconsistency between direct database queries and application queries.`
* `[✓] Step PGSQL-2: Fixed SQL query issues in calculate_duration_sql() function:`
    * `[✓] Step PGSQL-2.1: Added proper quoting for "user" column to avoid PostgreSQL reserved keyword issues.`
    * `[✓] Step PGSQL-2.2: Added explicit schema reference to table name (public.work_events).`
    * `[✓] Step PGSQL-2.3: Verified SQL queries properly reference "user" column with quotes.`
* `[✓] Step PGSQL-3: Updated documentation in README.md to include PostgreSQL-specific troubleshooting:`
    * `[✓] Step PGSQL-3.1: Added notes about PostgreSQL reserved keywords.`
    * `[✓] Step PGSQL-3.2: Added guidance on schema specification in SQL queries.`
    * `[✓] Step PGSQL-3.3: Added explanation about database username vs. data in "user" column.`
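The root cause behind Task FixPostgreSQLUserDisplay is worth spelling out: in PostgreSQL, `USER` is a reserved word that evaluates to the current database login (equivalent to `CURRENT_USER`), so an unquoted `SELECT user FROM work_events` returns the connection's username for every row instead of the column's contents. A sketch of the before/after query strings (illustrative only; the project's real queries live in `calculate_duration_sql()`):

```python
# Unquoted: PostgreSQL parses `user` as the CURRENT_USER keyword, so every row
# shows the database login name -- exactly the bug described above.
BROKEN_SQL = "SELECT user, timestamp FROM work_events"

# Quoted identifier plus an explicit schema, per Steps PGSQL-2.1 and PGSQL-2.2:
FIXED_SQL = 'SELECT "user", timestamp FROM public.work_events'

print('"user"' in FIXED_SQL and "public.work_events" in FIXED_SQL)  # → True
```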
**Task: FixFrontendIssues**
*(Completed Task)*
*Fix various frontend display issues for better usability and data presentation.*

* `[✓] Step FE-1: Format Day column to dd/mm/yyyy format`
    * `[✓] Step FE-1.1: Modified populateTable() function in dashboard.html to format date in periodValue.`
    * `[✓] Step FE-1.2: Add conditional date formatting that converts ISO dates (YYYY-MM-DD) to DD/MM/YYYY format.`
* `[✓] Step FE-2: Add sub-menu for "This Week" to select specific days`
    * `[✓] Step FE-2.1: Add a dropdown menu/selector that only appears when "This Week" is selected.`
    * `[✓] Step FE-2.2: Populate dropdown with days of the current week (Monday to Sunday).`
    * `[✓] Step FE-2.3: Create a new API endpoint in app.py for filtering by specific day within the weekly view.`
    * `[✓] Step FE-2.4: Update the fetch mechanism to use the new endpoint when a specific day is selected.`
* `[✓] Step FE-3: Fix positioning and alignment issues`
    * `[✓] Step FE-3.1: Center-align the Duration column in the CSS.`
    * `[✓] Step FE-3.2: Center-align the First Login Time column in Today view.`
    * `[✓] Step FE-3.3: Ensure consistent padding and alignment across all table columns.`
* `[✓] Step FE-4: Remove First Login Time from This Month tab`
    * `[✓] Step FE-4.1: Modify the populateTable() function to conditionally hide the First Login Time column when in monthly view.`
    * `[✓] Step FE-4.2: Add logic to adjust column widths when First Login Time is hidden.`
* `[✓] Step FE-5: Aggregate time worked by users`
    * `[✓] Step FE-5.1: Modify backend SQL queries in app.py to properly aggregate work hours per user.`
    * `[✓] Step FE-5.2: Update the calculate_duration_sql() function to ensure records are fully grouped by user.`
    * `[✓] Step FE-5.3: Add a preprocessing step in format_report_data() to consolidate any remaining duplicate user entries.`
    * `[✓] Step FE-5.4: Update the frontend table rendering to handle consolidated user data properly.`

**Task: FixTodayTabFiltering**
*(Completed Task)*
*Fix issue where Today tab shows entries from different dates instead of just the current day.*

* `[✓] Step FTF-1: Modify the get_daily_report endpoint in app.py to filter for today's date only.`
* `[✓] Step FTF-2: Add date filtering to include only records from the current date.`
* `[✓] Step FTF-3: Update logging to reflect the new filtered results.`

**Task: FixWeeklyMonthlyAggregation**
*(Completed Task)*
*Fix weekly and monthly tabs to properly aggregate entries by user to prevent duplicate user rows.*

* `[✓] Step FWMA-1: Modify the get_weekly_report endpoint to aggregate entries by user within the same week:`
    * `[✓] Step FWMA-1.1: Add filtering to only include current week.`
    * `[✓] Step FWMA-1.2: Implement user-based aggregation for total hours within the same week.`
    * `[✓] Step FWMA-1.3: Maintain the earliest first login time when aggregating.`
* `[✓] Step FWMA-2: Modify the get_monthly_report endpoint to aggregate entries by user within the same month:`
    * `[✓] Step FWMA-2.1: Add filtering to only include current month.`
    * `[✓] Step FWMA-2.2: Implement user-based aggregation for total hours within the same month.`
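The per-user consolidation described in Steps FE-5.3 and FWMA-1.2/1.3 can be sketched as a small preprocessing pass: sum the hours for duplicate users and keep the earliest first login. The dict keys here (`user`, `duration_hours`, `first_login_time`) mirror the report fields named elsewhere in this plan, but the helper itself is a hypothetical illustration, not the project's actual `format_report_data()` code:

```python
def consolidate_user_entries(rows):
    """Merge rows for the same user: total the hours, keep earliest first login."""
    merged = {}
    for row in rows:
        user = row["user"]
        if user not in merged:
            merged[user] = dict(row)
        else:
            merged[user]["duration_hours"] += row["duration_hours"]
            # ISO-8601 timestamps compare correctly as strings
            merged[user]["first_login_time"] = min(
                merged[user]["first_login_time"], row["first_login_time"]
            )
    return list(merged.values())

rows = [
    {"user": "nika", "duration_hours": 3.0, "first_login_time": "2024-01-15T09:00:00"},
    {"user": "nika", "duration_hours": 2.0, "first_login_time": "2024-01-16T08:30:00"},
]
result = consolidate_user_entries(rows)
print(result[0]["duration_hours"])    # → 5.0
print(result[0]["first_login_time"])  # → 2024-01-15T09:00:00
```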
**Task: AddDateNavigation**
|
||||
*Add navigation arrows and a calendar component to the dashboard for selecting specific dates.*
|
||||
|
||||
* `[✓] Step DateNav-1: Modify the backend API:`
|
||||
* `[✓] Step DateNav-1.1: Update get_daily_report endpoint in app.py to accept a specific date parameter.`
|
||||
* `[✓] Step DateNav-1.2: Implement date parameter handling to filter results by the selected date.`
|
||||
* `[✓] Step DateNav-1.3: Add appropriate logging for the date parameter.`
|
||||
* `[✓] Step DateNav-2: Add navigation components to dashboard.html:`
|
||||
* `[✓] Step DateNav-2.1: Add left and right arrow buttons next to the Today filter button.`
|
||||
* `[✓] Step DateNav-2.2: Add a date picker/calendar icon button that opens a calendar component.`
|
||||
* `[✓] Step DateNav-2.3: Style the new navigation components to match the existing design.`
|
||||
* `[✓] Step DateNav-3: Implement calendar component functionality:`
|
||||
* `[✓] Step DateNav-3.1: Add an HTML date input or a Bootstrap datepicker component for selecting dates.`
|
||||
* `[✓] Step DateNav-3.2: Implement JavaScript function to set the current selected date.`
|
||||
* `[✓] Step DateNav-3.3: Add event handlers to update the display when a date is selected.`
|
||||
* `[✓] Step DateNav-4: Implement arrow navigation functionality:`
|
||||
* `[✓] Step DateNav-4.1: Add click handlers for the arrow buttons to move forward/backward by one day.`
|
||||
* `[✓] Step DateNav-4.2: Update the UI to reflect the selected date.`
|
||||
* `[✓] Step DateNav-4.3: Implement the fetch function to reload data when a new date is selected.`
|
||||
* `[✓] Step DateNav-5: Update the date display and state management:`
|
||||
* `[✓] Step DateNav-5.1: Add a visual indicator of the currently selected date.`
|
||||
* `[✓] Step DateNav-5.2: Add state management to track the currently selected date.`
|
||||
* `[✓] Step DateNav-5.3: Update the table header to show the selected date when in daily view.`
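
The backend side of this task (steps DateNav-1.1/1.2) might parse the date parameter as below; the `YYYY-MM-DD` format and the fall-back-to-today behaviour are assumptions, not the project's confirmed contract:

```python
from datetime import date, datetime

def parse_report_date(date_param, default=None):
    """Parse an optional date query parameter (YYYY-MM-DD); fall back to
    today's date (or an explicit default) when missing or invalid."""
    fallback = default or date.today()
    if not date_param:
        return fallback
    try:
        return datetime.strptime(date_param, "%Y-%m-%d").date()
    except ValueError:
        return fallback

# Inside the Flask endpoint this would be driven by the request, e.g.:
#   selected = parse_report_date(request.args.get("date"))
```

Falling back silently keeps the existing Today behaviour intact when the frontend sends no parameter; an alternative design would return a 400 on invalid input.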

**Task: RefactorCodebase**

*(Completed Task)*

*Refactor the codebase by splitting dashboard.html into separate components and breaking app.py into logical modules.*

*Front-end Refactoring:*

* `[✓] Step FE-Refactor-1: Create the static file structure:`
    * `[✓] Step FE-Refactor-1.1: Create the static/css directory.`
    * `[✓] Step FE-Refactor-1.2: Create the static/js directory.`
* `[✓] Step FE-Refactor-2: Split dashboard.html:`
    * `[✓] Step FE-Refactor-2.1: Extract CSS to static/css/dashboard.css.`
    * `[✓] Step FE-Refactor-2.2: Extract JavaScript to static/js/dashboard.js.`
    * `[✓] Step FE-Refactor-2.3: Update the HTML to reference the external files.`

*Back-end Refactoring:*

* `[✓] Step BE-Refactor-1: Create the package structure:`
    * `[✓] Step BE-Refactor-1.1: Create the app directory and subdirectories.`
    * `[✓] Step BE-Refactor-1.2: Create __init__.py files for all packages.`
* `[✓] Step BE-Refactor-2: Implement the application factory pattern:`
    * `[✓] Step BE-Refactor-2.1: Create app/__init__.py with a create_app function.`
    * `[✓] Step BE-Refactor-2.2: Move configuration logic to app/__init__.py.`
* `[✓] Step BE-Refactor-3: Extract database components:`
    * `[✓] Step BE-Refactor-3.1: Create app/models.py with the WorkEvent model.`
    * `[✓] Step BE-Refactor-3.2: Initialize SQLAlchemy in app/__init__.py.`
* `[✓] Step BE-Refactor-4: Extract API endpoints:`
    * `[✓] Step BE-Refactor-4.1: Create app/api/events.py with the report_event endpoint.`
    * `[✓] Step BE-Refactor-4.2: Create app/api/reports.py with the report endpoints.`
* `[✓] Step BE-Refactor-5: Extract utility functions:`
    * `[✓] Step BE-Refactor-5.1: Create app/utils/queries.py with SQL query functions.`
    * `[✓] Step BE-Refactor-5.2: Create app/utils/formatting.py with data formatting functions.`
* `[✓] Step BE-Refactor-6: Extract web routes:`
    * `[✓] Step BE-Refactor-6.1: Create app/views/dashboard.py with the dashboard route.`
* `[✓] Step BE-Refactor-7: Extract error handlers:`
    * `[✓] Step BE-Refactor-7.1: Create app/errors.py with error handler functions.`
* `[✓] Step BE-Refactor-8: Create a new entry point:`
    * `[✓] Step BE-Refactor-8.1: Create a new run.py file as the application entry point.`
    * `[✓] Step BE-Refactor-8.2: Update any references to the old app.py.`
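
A minimal sketch of the application factory pattern from BE-Refactor-2 together with the thin run.py entry point from BE-Refactor-8; the inline blueprint stands in for the real app/api modules, and the route path and config keys are illustrative assumptions:

```python
from flask import Flask, Blueprint

# Stand-in for app/api/reports.py; in the project each blueprint lives in
# its own module and is imported inside create_app.
reports_bp = Blueprint("reports", __name__)

@reports_bp.route("/api/reports/daily")
def daily_report():
    # Returning a dict makes Flask emit a JSON response.
    return {"entries": []}

def create_app(config=None):
    """Build and configure a Flask app on demand instead of at import time,
    so tests can create isolated instances with their own config."""
    app = Flask(__name__)
    app.config.update(config or {})
    app.register_blueprint(reports_bp)
    return app

# run.py then reduces to a thin entry point:
#   from app import create_app
#   app = create_app()
#   if __name__ == "__main__":
#       app.run()
```

Deferring app creation to a function is what lets the extracted blueprints, models, and error handlers be registered in one place without circular imports.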

## Log

*A chronological log of significant actions, events, tool outputs, and decisions.*

*(This section will be populated by the AI during operation)*

*Actual Log:*
* `[2025-05-07 10:15:00] Completed FixTodayTabFiltering task: Modified get_daily_report to filter by current date only.`
* `[2025-05-07 10:30:00] Completed FixWeeklyMonthlyAggregation task: Modified weekly and monthly reports to aggregate entries by user.`
* `[2025-05-07 11:00:00] Started new task AddDateNavigation: Creating plan for adding date navigation arrows and calendar component to dashboard.`
* `[2025-05-07 11:15:00] Plan for AddDateNavigation approved. Starting implementation with backend API modifications.`
* `[2025-05-07 11:45:00] Completed AddDateNavigation task: Added date navigation with arrow buttons and calendar component for date selection.`
* `[2025-05-08 09:00:00] Started new task RefactorCodebase: Creating plan for refactoring dashboard.html and app.py into modular components.`
* `[2025-05-08 09:30:00] Created plan for RefactorCodebase: Split frontend into HTML/CSS/JS and backend into modular packages.`
* `[2025-05-08 09:45:00] Plan for RefactorCodebase approved. Beginning implementation with frontend static file structure.`
* `[2025-05-08 10:00:00] Completed Step FE-Refactor-1: Created static file structure with css and js directories.`
* `[2025-05-08 10:30:00] Completed Step FE-Refactor-2: Split dashboard.html into separate HTML, CSS, and JavaScript files.`
* `[2025-05-08 11:00:00] Completed Step BE-Refactor-1: Created application package structure with __init__.py files.`
* `[2025-05-08 11:30:00] Completed Step BE-Refactor-2: Implemented application factory pattern in app/__init__.py.`
* `[2025-05-08 12:00:00] Completed Step BE-Refactor-3: Extracted WorkEvent model to app/models.py.`
* `[2025-05-08 12:30:00] Completed Step BE-Refactor-4.1: Created events.py with report_event endpoint.`
* `[2025-05-08 13:00:00] Completed Step BE-Refactor-4.2: Created reports.py with all report endpoints.`
* `[2025-05-08 13:30:00] Completed Step BE-Refactor-5: Extracted utility functions to queries.py and formatting.py.`
* `[2025-05-08 14:00:00] Completed Step BE-Refactor-6: Created dashboard.py with web routes.`
* `[2025-05-08 14:30:00] Completed Step BE-Refactor-7: Created errors.py with error handler functions.`
* `[2025-05-08 15:00:00] Completed Step BE-Refactor-8.1: Created run.py as the new application entry point.`
* `[2025-05-08 15:30:00] Completed Step BE-Refactor-8.2: Updated README.md with new application structure references.`
* `[2025-05-08 16:00:00] Completed RefactorCodebase task: Successfully refactored the application into a modular structure.`