mirror of
https://github.com/FoggedLens/deflock.git
synced 2026-02-12 15:02:45 +00:00
Add Blog (#83)
* add rss ingestor
* it works
* blog working
* fix the theme switcher
* finalize blog, re-add store
* fix terraform for blog_scraper
* update sitemap
* update readme
1  .gitignore  vendored
@@ -43,6 +43,7 @@ terraform.tfstate*
 # Lambda Python Stuff
 serverless/*/lambda.zip
+serverless/*/.build/
 serverless/*/src/*
 !serverless/*/src/alpr_cache.py
 !serverless/*/src/alpr_clusters.py

1  serverless/blog_scraper/.python-version  Normal file
@@ -0,0 +1 @@
+3.14
192  serverless/blog_scraper/README.md  Normal file
@@ -0,0 +1,192 @@

# Blog RSS Scraper

This Lambda function ingests RSS feeds into a Directus CMS instance. It's specifically configured to pull from the "Have I Been Flocked?" RSS feed and sync the posts with your Directus blog collection.

## Features

- **RSS Feed Parsing**: Extracts title, link, pubDate, and description from RSS entries
- **Directus Integration**: Creates, updates, and deletes blog posts via the Directus API
- **Idempotent Operation**: Safe to run multiple times; only makes necessary changes
- **Selective Sync**: Only manages RSS-ingested posts (identified by the `externalUrl` field)
- **Error Handling**: Comprehensive logging and error recovery
## Setup

### Environment Variables

Set the following environment variables:

```bash
# Required
DIRECTUS_API_TOKEN=your_directus_api_token_here

# Optional (defaults to https://cms.deflock.me)
DIRECTUS_BASE_URL=https://your-directus-instance.com
```

### Directus Collection Schema

Your Directus `blog` collection should have the following fields:

- `id` (integer, auto-increment)
- `title` (string, required)
- `description` (text)
- `content` (rich text, optional - RSS posts will have this as null)
- `externalUrl` (string, optional - identifies RSS-ingested posts)
- `published` (datetime)

### Dependencies

Install dependencies using uv:

```bash
uv sync
```
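The environment handling described above can be sketched as a small pure function. `load_config` is a hypothetical helper (the scraper does this inline in `BlogScraper.__init__`), shown here so the required/optional split is explicit:

```python
import os

def load_config(env=os.environ):
    """Resolve scraper settings (hypothetical helper mirroring the README).

    DIRECTUS_API_TOKEN is required; DIRECTUS_BASE_URL falls back to the
    production CMS URL, matching the defaults documented above.
    """
    token = env.get("DIRECTUS_API_TOKEN")
    if not token:
        raise ValueError("DIRECTUS_API_TOKEN environment variable is required")
    return {
        "base_url": env.get("DIRECTUS_BASE_URL", "https://cms.deflock.me"),
        "token": token,
    }

# A missing token fails fast with a clear message
try:
    load_config(env={})
except ValueError as e:
    print(e)  # DIRECTUS_API_TOKEN environment variable is required
```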
## Usage

### Local Testing

```bash
uv run main.py
```

### AWS Lambda

Deploy as a Python 3.14 Lambda function. The `lambda_handler` function serves as the entry point.

#### Sample Lambda Event

The function doesn't require any specific event data:

```json
{}
```

#### Sample Response

Success:

```json
{
  "statusCode": 200,
  "body": {
    "message": "RSS synchronization completed successfully",
    "stats": {
      "created": 2,
      "updated": 1,
      "deleted": 0,
      "errors": 0
    }
  }
}
```

Error:

```json
{
  "statusCode": 500,
  "body": {
    "message": "RSS synchronization failed",
    "error": "DIRECTUS_API_TOKEN environment variable is required"
  }
}
```
## How It Works

1. **Fetch RSS Feed**: Downloads and parses the RSS feed from `https://haveibeenflocked.com/feed.xml`
2. **Get Existing Posts**: Queries Directus for all blog posts that have an `externalUrl` (these are RSS-managed posts)
3. **Synchronization**:
   - **Create**: New RSS entries that don't exist in Directus
   - **Update**: Existing posts where the title or description has changed
   - **Delete**: Directus posts with an `externalUrl` that no longer exist in the RSS feed
4. **Preserve Manual Posts**: Posts without an `externalUrl` are left untouched
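The create/update/delete decision above is a pure set computation, separable from the API calls. A sketch (`plan_sync` is a hypothetical helper; the real scraper interleaves these decisions with Directus requests):

```python
def plan_sync(rss_entries, existing_posts):
    """Classify posts into create/update/delete lists, mirroring the
    synchronization steps described above."""
    existing_by_url = {p["externalUrl"]: p for p in existing_posts}
    rss_urls = {e["externalUrl"] for e in rss_entries}

    create, update = [], []
    for entry in rss_entries:
        current = existing_by_url.get(entry["externalUrl"])
        if current is None:
            create.append(entry)  # not yet in Directus
        elif (current["title"] != entry["title"]
              or current["description"] != entry["description"]):
            update.append(entry)  # changed upstream

    # Posts whose externalUrl vanished from the feed get deleted; manual
    # posts (no externalUrl) never appear in existing_posts at all.
    delete = [p for p in existing_posts if p["externalUrl"] not in rss_urls]
    return create, update, delete
```

Running it twice on the same inputs yields the same plan, which is what makes the sync idempotent.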
## RSS Feed Structure

The scraper expects standard RSS 2.0 format with the following elements:

- `<title>`: Post title
- `<link>`: Post URL (becomes `externalUrl`)
- `<pubDate>`: Publication date (becomes `published`)
- `<description>` or `<content>`: Post description (HTML tags are stripped)
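The scraper itself uses `feedparser` for this; the field mapping can be illustrated with a stdlib-only sketch on an inline sample feed (sample values are made up). The tag-stripping regex is the same one the scraper applies to descriptions:

```python
import re
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document with the elements listed above
SAMPLE = """<rss version="2.0"><channel>
<item>
  <title>New ALPR cluster found</title>
  <link>https://example.com/posts/1</link>
  <pubDate>Mon, 06 Jan 2025 10:00:00 GMT</pubDate>
  <description>&lt;p&gt;Details inside.&lt;/p&gt;</description>
</item>
</channel></rss>"""

def parse_items(xml_text):
    """Map RSS <item> elements to the Directus field names."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        desc = item.findtext("description", default="")
        items.append({
            "title": item.findtext("title"),
            "externalUrl": item.findtext("link"),
            "published": item.findtext("pubDate"),
            # Same regex-based tag stripping the scraper uses
            "description": re.sub(r"<[^>]+>", "", desc).strip(),
        })
    return items
```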
## Error Handling

- Invalid dates are logged as warnings but don't stop processing
- Individual post errors are logged and counted but don't stop the entire sync
- HTTP errors from the Directus API are logged with full details
- Missing environment variables cause immediate failure with clear error messages

## Logging

The function uses Python's standard logging module at INFO level. Key events logged:

- RSS feed fetch status
- Number of entries parsed
- Create/update/delete operations
- Errors and warnings
- Final synchronization statistics

## Security Considerations

- Store the Directus API token securely (AWS Secrets Manager is recommended for production)
- Use HTTPS for all API communications (enforced by default)
- The function only modifies posts with an `externalUrl` - manual posts are safe
- Consider rate limiting if running frequently
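The "invalid dates become warnings" behavior comes down to a try/except around date parsing. The scraper uses `python-dateutil`, which accepts many formats; a stdlib sketch of the same fallback, restricted to the RFC 822 dates RSS 2.0 mandates:

```python
from email.utils import parsedate_to_datetime

def to_iso(pub_date):
    """Convert an RSS pubDate to ISO 8601, or return None on failure
    (the real code logs a warning and simply omits the `published` field)."""
    try:
        return parsedate_to_datetime(pub_date).isoformat()
    except (TypeError, ValueError):
        return None

print(to_iso("Mon, 06 Jan 2025 10:00:00 GMT"))  # 2025-01-06T10:00:00+00:00
print(to_iso("not a date"))                      # None
```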
## Deployment

### AWS Lambda Deployment Package

1. Navigate to [the terraform directory](../../terraform/).
2. Set the required variables in a local copy of `terraform.tfvars`.
3. Run `terraform apply`.

### Environment Variables in Lambda

Set in the Lambda function configuration:

- `DIRECTUS_API_TOKEN`: Your Directus API token
- `DIRECTUS_BASE_URL`: Your Directus instance URL (optional)

### Scheduling

The Terraform configuration sets up a CloudWatch Events rule to run this function periodically.
## Troubleshooting

### Common Issues

1. **401 Unauthorized**: Check your `DIRECTUS_API_TOKEN`
2. **404 Not Found**: Verify `DIRECTUS_BASE_URL` and the collection name (`blog`)
3. **RSS Parse Errors**: Check that the RSS feed is accessible and valid
4. **Date Parse Failures**: Usually logged as warnings; they don't stop processing

### Testing Connection

The function fails fast if it can't connect to Directus, making debugging easier.
## Development

### Local Development Setup

```bash
# Clone and navigate to the blog_scraper directory
cd serverless/blog_scraper

# Install dependencies
uv sync

# Set environment variables
export DIRECTUS_API_TOKEN="your_token"
export DIRECTUS_BASE_URL="https://cms.deflock.me"

# Run locally
uv run main.py
```

### Testing with Different RSS Feeds

To test with a different RSS feed, modify the `rss_url` in the `BlogScraper.__init__` method.
283  serverless/blog_scraper/main.py  Normal file
@@ -0,0 +1,283 @@

import os
import re
import json
import logging
import feedparser
import requests
from dateutil import parser as date_parser
from typing import List, Dict, Optional

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class BlogScraper:
    """RSS feed scraper that ingests blog posts into Directus CMS"""

    def __init__(self):
        self.rss_url = "https://haveibeenflocked.com/feed.xml"
        self.directus_base_url = os.getenv("DIRECTUS_BASE_URL", "https://cms.deflock.me")
        self.directus_token = os.getenv("DIRECTUS_API_TOKEN")

        if not self.directus_token:
            raise ValueError("DIRECTUS_API_TOKEN environment variable is required")

        self.headers = {
            "Authorization": f"Bearer {self.directus_token}",
            "Content-Type": "application/json",
            "User-Agent": "deflock-blog-scraper/1.0"
        }

    def fetch_rss_feed(self) -> feedparser.FeedParserDict:
        """Fetch and parse the RSS feed"""
        logger.info(f"Fetching RSS feed from {self.rss_url}")

        try:
            feed = feedparser.parse(self.rss_url)
            if feed.bozo:
                logger.warning(f"Feed parsing warning: {feed.bozo_exception}")

            logger.info(f"Successfully parsed RSS feed with {len(feed.entries)} entries")
            return feed
        except Exception as e:
            logger.error(f"Error fetching RSS feed: {e}")
            raise

    def get_existing_posts(self) -> List[Dict]:
        """Get all existing blog posts from Directus that have external URLs"""
        logger.info("Fetching existing blog posts from Directus")

        try:
            # Filter for posts that have an externalUrl (RSS-ingested posts)
            url = f"{self.directus_base_url}/items/blog"

            # Properly format the filter as JSON; requests URL-encodes it
            filter_obj = {
                "externalUrl": {
                    "_nnull": True  # not null
                }
            }

            params = {
                "filter": json.dumps(filter_obj),
                "limit": -1  # Get all records
            }

            response = requests.get(url, headers=self.headers, params=params)
            response.raise_for_status()

            data = response.json()
            posts = data.get("data", [])
            logger.info(f"Found {len(posts)} existing RSS-ingested posts")
            return posts

        except Exception as e:
            logger.error(f"Error fetching existing posts: {e}")
            raise

    def create_blog_post(self, post_data: Dict) -> Optional[Dict]:
        """Create a new blog post in Directus"""
        logger.info(f"Creating new blog post: {post_data['title']}")

        response = None
        try:
            url = f"{self.directus_base_url}/items/blog"

            response = requests.post(url, headers=self.headers, json=post_data)

            if response.status_code >= 400:
                logger.error(f"HTTP {response.status_code} error response body: {response.text}")

            response.raise_for_status()

            created_post = response.json()
            logger.info(f"Successfully created blog post with ID: {created_post['data']['id']}")
            return created_post["data"]

        except requests.exceptions.HTTPError as e:
            logger.error(f"HTTP error creating blog post '{post_data['title']}': {e}")
            logger.error(f"Response content: {response.text if response is not None else 'No response available'}")
            raise
        except Exception as e:
            logger.error(f"Error creating blog post '{post_data['title']}': {e}")
            raise

    def update_blog_post(self, post_id: int, post_data: Dict) -> Optional[Dict]:
        """Update an existing blog post in Directus"""
        logger.info(f"Updating blog post ID {post_id}: {post_data['title']}")

        response = None
        try:
            url = f"{self.directus_base_url}/items/blog/{post_id}"

            response = requests.patch(url, headers=self.headers, json=post_data)

            if response.status_code >= 400:
                logger.error(f"HTTP {response.status_code} error response body: {response.text}")

            response.raise_for_status()

            updated_post = response.json()
            logger.info(f"Successfully updated blog post ID: {post_id}")
            return updated_post["data"]

        except requests.exceptions.HTTPError as e:
            logger.error(f"HTTP error updating blog post ID {post_id}: {e}")
            logger.error(f"Response content: {response.text if response is not None else 'No response available'}")
            raise
        except Exception as e:
            logger.error(f"Error updating blog post ID {post_id}: {e}")
            raise

    def delete_blog_post(self, post_id: int) -> None:
        """Delete a blog post from Directus"""
        logger.info(f"Deleting blog post ID {post_id}")

        try:
            url = f"{self.directus_base_url}/items/blog/{post_id}"
            response = requests.delete(url, headers=self.headers)
            response.raise_for_status()

            logger.info(f"Successfully deleted blog post ID: {post_id}")

        except Exception as e:
            logger.error(f"Error deleting blog post ID {post_id}: {e}")
            raise

    def parse_feed_entry(self, entry) -> Dict:
        """Parse a feed entry into Directus blog post format"""
        # Parse the publication date
        pub_date = None
        if hasattr(entry, 'published'):
            try:
                pub_date = date_parser.parse(entry.published).isoformat()
            except Exception as e:
                logger.warning(f"Could not parse date {entry.published}: {e}")

        # Extract description from summary or content
        description = ""
        if hasattr(entry, 'summary'):
            description = entry.summary
        elif hasattr(entry, 'content') and entry.content:
            # Take the first content item's value
            description = entry.content[0].value

        # Clean up the description (remove HTML tags if present)
        # For production, you might want to use a proper HTML parser like BeautifulSoup
        description = re.sub(r'<[^>]+>', '', description)
        description = description.strip()

        post_data = {
            "title": entry.title,
            "description": description,
            "externalUrl": entry.link,
            "content": None,  # RSS posts don't have content, just external links
        }

        if pub_date:
            post_data["published"] = pub_date

        return post_data

    def sync_rss_posts(self) -> Dict[str, int]:
        """Main synchronization logic - ensures RSS feed matches Directus"""
        logger.info("Starting RSS to Directus synchronization")

        # Fetch RSS feed
        feed = self.fetch_rss_feed()

        # Get existing posts from Directus
        existing_posts = self.get_existing_posts()

        # Create lookup by external URL
        existing_by_url = {post["externalUrl"]: post for post in existing_posts}

        stats = {
            "created": 0,
            "updated": 0,
            "deleted": 0,
            "errors": 0
        }

        # Track URLs from RSS feed
        rss_urls = set()

        # Process each RSS entry
        for entry in feed.entries:
            try:
                post_data = self.parse_feed_entry(entry)
                url = post_data["externalUrl"]
                rss_urls.add(url)

                if url in existing_by_url:
                    # Update existing post if needed
                    existing_post = existing_by_url[url]

                    # Check if update is needed (compare title and description)
                    needs_update = (
                        existing_post["title"] != post_data["title"] or
                        existing_post["description"] != post_data["description"]
                    )

                    if needs_update:
                        self.update_blog_post(existing_post["id"], post_data)
                        stats["updated"] += 1
                else:
                    # Create new post
                    self.create_blog_post(post_data)
                    stats["created"] += 1

            except Exception as e:
                logger.error(f"Error processing RSS entry {entry.link}: {e}")
                stats["errors"] += 1

        # Delete posts that are no longer in the RSS feed
        for existing_post in existing_posts:
            if existing_post["externalUrl"] not in rss_urls:
                try:
                    self.delete_blog_post(existing_post["id"])
                    stats["deleted"] += 1
                except Exception as e:
                    logger.error(f"Error deleting post {existing_post['id']}: {e}")
                    stats["errors"] += 1

        logger.info(f"Synchronization complete. Stats: {stats}")
        return stats


def lambda_handler(event, context):
    """AWS Lambda handler function"""
    try:
        scraper = BlogScraper()
        stats = scraper.sync_rss_posts()

        return {
            'statusCode': 200,
            'body': {
                'message': 'RSS synchronization completed successfully',
                'stats': stats
            }
        }
    except Exception as e:
        logger.error(f"Lambda execution failed: {e}")
        return {
            'statusCode': 500,
            'body': {
                'message': 'RSS synchronization failed',
                'error': str(e)
            }
        }


def main():
    """Main function for local testing"""
    try:
        scraper = BlogScraper()
        stats = scraper.sync_rss_posts()
        print(f"Synchronization completed with stats: {stats}")
    except Exception as e:
        print(f"Error: {e}")


if __name__ == "__main__":
    main()
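The Directus filter in `get_existing_posts` above is JSON-encoded before being sent as a query parameter. A small sketch of what actually goes on the wire (`requests` performs equivalent URL-encoding when given `params`):

```python
import json
from urllib.parse import urlencode

# Same filter the scraper builds: only posts with a non-null externalUrl
filter_obj = {"externalUrl": {"_nnull": True}}
params = {"filter": json.dumps(filter_obj), "limit": -1}

# Encode the query string by hand to see the resulting URL suffix
query = urlencode(params)
print(query)
```

Note that Python's `True` becomes lowercase JSON `true` in the serialized filter, which is what the Directus API expects.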
11  serverless/blog_scraper/pyproject.toml  Normal file
@@ -0,0 +1,11 @@

[project]
name = "blog-scraper"
version = "0.1.0"
description = "Pulls RSS feed from HIBF and stores in Directus CMS"
readme = "README.md"
requires-python = ">=3.14"
dependencies = [
    "feedparser>=6.0.11",
    "requests>=2.32.0",
    "python-dateutil>=2.9.0"
]
126  serverless/blog_scraper/uv.lock  generated  Normal file
@@ -0,0 +1,126 @@

version = 1
revision = 3
requires-python = ">=3.14"

[[package]]
name = "blog-scraper"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "feedparser" },
    { name = "python-dateutil" },
    { name = "requests" },
]

[package.metadata]
requires-dist = [
    { name = "feedparser", specifier = ">=6.0.11" },
    { name = "python-dateutil", specifier = ">=2.9.0" },
    { name = "requests", specifier = ">=2.32.0" },
]

[[package]]
name = "certifi"
version = "2025.11.12"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/8c/58f469717fa48465e4a50c014a0400602d3c437d7c0c468e17ada824da3a/certifi-2025.11.12.tar.gz", hash = "sha256:d8ab5478f2ecd78af242878415affce761ca6bc54a22a27e026d7c25357c3316", size = 160538, upload-time = "2025-11-12T02:54:51.517Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/70/7d/9bc192684cea499815ff478dfcdc13835ddf401365057044fb721ec6bddb/certifi-2025.11.12-py3-none-any.whl", hash = "sha256:97de8790030bbd5c2d96b7ec782fc2f7820ef8dba6db909ccf95449f2d062d4b", size = 159438, upload-time = "2025-11-12T02:54:49.735Z" },
]

[[package]]
name = "charset-normalizer"
version = "3.4.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/13/69/33ddede1939fdd074bce5434295f38fae7136463422fe4fd3e0e89b98062/charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a", size = 129418, upload-time = "2025-10-14T04:42:32.879Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/2a/35/7051599bd493e62411d6ede36fd5af83a38f37c4767b92884df7301db25d/charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd", size = 207746, upload-time = "2025-10-14T04:41:33.773Z" },
    { url = "https://files.pythonhosted.org/packages/10/9a/97c8d48ef10d6cd4fcead2415523221624bf58bcf68a802721a6bc807c8f/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb", size = 147889, upload-time = "2025-10-14T04:41:34.897Z" },
    { url = "https://files.pythonhosted.org/packages/10/bf/979224a919a1b606c82bd2c5fa49b5c6d5727aa47b4312bb27b1734f53cd/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e", size = 143641, upload-time = "2025-10-14T04:41:36.116Z" },
    { url = "https://files.pythonhosted.org/packages/ba/33/0ad65587441fc730dc7bd90e9716b30b4702dc7b617e6ba4997dc8651495/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14", size = 160779, upload-time = "2025-10-14T04:41:37.229Z" },
    { url = "https://files.pythonhosted.org/packages/67/ed/331d6b249259ee71ddea93f6f2f0a56cfebd46938bde6fcc6f7b9a3d0e09/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191", size = 159035, upload-time = "2025-10-14T04:41:38.368Z" },
    { url = "https://files.pythonhosted.org/packages/67/ff/f6b948ca32e4f2a4576aa129d8bed61f2e0543bf9f5f2b7fc3758ed005c9/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838", size = 152542, upload-time = "2025-10-14T04:41:39.862Z" },
    { url = "https://files.pythonhosted.org/packages/16/85/276033dcbcc369eb176594de22728541a925b2632f9716428c851b149e83/charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6", size = 149524, upload-time = "2025-10-14T04:41:41.319Z" },
    { url = "https://files.pythonhosted.org/packages/9e/f2/6a2a1f722b6aba37050e626530a46a68f74e63683947a8acff92569f979a/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e", size = 150395, upload-time = "2025-10-14T04:41:42.539Z" },
    { url = "https://files.pythonhosted.org/packages/60/bb/2186cb2f2bbaea6338cad15ce23a67f9b0672929744381e28b0592676824/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c", size = 143680, upload-time = "2025-10-14T04:41:43.661Z" },
    { url = "https://files.pythonhosted.org/packages/7d/a5/bf6f13b772fbb2a90360eb620d52ed8f796f3c5caee8398c3b2eb7b1c60d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090", size = 162045, upload-time = "2025-10-14T04:41:44.821Z" },
    { url = "https://files.pythonhosted.org/packages/df/c5/d1be898bf0dc3ef9030c3825e5d3b83f2c528d207d246cbabe245966808d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152", size = 149687, upload-time = "2025-10-14T04:41:46.442Z" },
    { url = "https://files.pythonhosted.org/packages/a5/42/90c1f7b9341eef50c8a1cb3f098ac43b0508413f33affd762855f67a410e/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828", size = 160014, upload-time = "2025-10-14T04:41:47.631Z" },
    { url = "https://files.pythonhosted.org/packages/76/be/4d3ee471e8145d12795ab655ece37baed0929462a86e72372fd25859047c/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec", size = 154044, upload-time = "2025-10-14T04:41:48.81Z" },
    { url = "https://files.pythonhosted.org/packages/b0/6f/8f7af07237c34a1defe7defc565a9bc1807762f672c0fde711a4b22bf9c0/charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9", size = 99940, upload-time = "2025-10-14T04:41:49.946Z" },
    { url = "https://files.pythonhosted.org/packages/4b/51/8ade005e5ca5b0d80fb4aff72a3775b325bdc3d27408c8113811a7cbe640/charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c", size = 107104, upload-time = "2025-10-14T04:41:51.051Z" },
    { url = "https://files.pythonhosted.org/packages/da/5f/6b8f83a55bb8278772c5ae54a577f3099025f9ade59d0136ac24a0df4bde/charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2", size = 100743, upload-time = "2025-10-14T04:41:52.122Z" },
    { url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
]

[[package]]
name = "feedparser"
version = "6.0.12"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "sgmllib3k" },
]
sdist = { url = "https://files.pythonhosted.org/packages/dc/79/db7edb5e77d6dfbc54d7d9df72828be4318275b2e580549ff45a962f6461/feedparser-6.0.12.tar.gz", hash = "sha256:64f76ce90ae3e8ef5d1ede0f8d3b50ce26bcce71dd8ae5e82b1cd2d4a5f94228", size = 286579, upload-time = "2025-09-10T13:33:59.486Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/4e/eb/c96d64137e29ae17d83ad2552470bafe3a7a915e85434d9942077d7fd011/feedparser-6.0.12-py3-none-any.whl", hash = "sha256:6bbff10f5a52662c00a2e3f86a38928c37c48f77b3c511aedcd51de933549324", size = 81480, upload-time = "2025-09-10T13:33:58.022Z" },
]

[[package]]
name = "idna"
version = "3.11"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
]

[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" },
]

[[package]]
name = "requests"
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "certifi" },
    { name = "charset-normalizer" },
    { name = "idna" },
    { name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]

[[package]]
name = "sgmllib3k"
version = "1.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/9e/bd/3704a8c3e0942d711c1299ebf7b9091930adae6675d7c8f476a7ce48653c/sgmllib3k-1.0.0.tar.gz", hash = "sha256:7868fb1c8bfa764c1ac563d3cf369c381d1325d36124933a726f29fcdaa812e9", size = 5750, upload-time = "2010-08-24T14:33:52.445Z" }

[[package]]
name = "six"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]

[[package]]
name = "urllib3"
version = "2.6.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1e/24/a2a2ed9addd907787d7aa0355ba36a6cadf1768b934c652ea78acbd59dcd/urllib3-2.6.2.tar.gz", hash = "sha256:016f9c98bb7e98085cb2b4b17b87d2c702975664e4f060c6532e64d1c1a5e797", size = 432930, upload-time = "2025-12-11T15:56:40.252Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/6d/b9/4095b668ea3678bf6a0af005527f39de12fb026516fb3df17495a733b7f8/urllib3-2.6.2-py3-none-any.whl", hash = "sha256:ec21cddfe7724fc7cb4ba4bea7aa8e2ef36f607a4bab81aa6ce42a13dc3f03dd", size = 131182, upload-time = "2025-12-11T15:56:38.584Z" },
]
@@ -18,6 +18,15 @@ module "alpr_cache" {
   sns_topic_arn = aws_sns_topic.lambda_alarms.arn
 }
 
+module "blog_scraper" {
+  module_name        = "blog_scraper"
+  source             = "./modules/blog_scraper"
+  rate               = "rate(30 minutes)"
+  sns_topic_arn      = aws_sns_topic.lambda_alarms.arn
+  directus_base_url  = var.directus_base_url
+  directus_api_token = var.directus_api_token
+}
+
 resource "aws_sns_topic" "lambda_alarms" {
   name = "lambda_alarms_topic"
 }
terraform/modules/blog_scraper/main.tf (new file, 133 lines)
@@ -0,0 +1,133 @@
resource "aws_iam_role" "lambda_role" {
  name = "blog_scraper_lambda_role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      }
    ]
  })
}

resource "aws_iam_policy" "lambda_basic_execution_policy" {
  name        = "blog_scraper_lambda_basic_execution_policy"
  description = "Basic execution policy for blog scraper Lambda function"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = [
          "logs:CreateLogGroup",
          "logs:CreateLogStream",
          "logs:PutLogEvents"
        ]
        Resource = "arn:aws:logs:*:*:*"
      }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "lambda_basic_execution_attachment" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.lambda_basic_execution_policy.arn
}

# Install dependencies using uv (since it's a uv project)
resource "null_resource" "uv_install" {
  provisioner "local-exec" {
    command = <<EOT
      cd ${path.root}/../serverless/${var.module_name}
      # Create build directory (ignored by git)
      rm -rf .build
      mkdir -p .build

      # Install dependencies using uv into build directory
      uv pip install --system --target .build -r pyproject.toml

      # Copy the main.py file to the build directory
      cp main.py .build/
    EOT
  }

  triggers = {
    # Re-run if pyproject.toml or main.py changes
    pyproject_hash = filemd5("${path.root}/../serverless/${var.module_name}/pyproject.toml")
    main_py_hash   = filemd5("${path.root}/../serverless/${var.module_name}/main.py")
  }
}

data "archive_file" "python_lambda_package" {
  type        = "zip"
  source_dir  = "${path.root}/../serverless/${var.module_name}/.build"
  output_path = "${path.root}/../serverless/${var.module_name}/lambda.zip"

  depends_on = [null_resource.uv_install]
}

resource "aws_lambda_function" "blog_scraper_lambda" {
  filename         = data.archive_file.python_lambda_package.output_path
  function_name    = var.module_name
  role             = aws_iam_role.lambda_role.arn
  handler          = "main.lambda_handler"
  runtime          = "python3.14"
  source_code_hash = data.archive_file.python_lambda_package.output_base64sha256
  timeout          = 300

  environment {
    variables = {
      DIRECTUS_BASE_URL  = var.directus_base_url
      DIRECTUS_API_TOKEN = var.directus_api_token
    }
  }
}

resource "aws_cloudwatch_event_rule" "lambda_rule" {
  name                = "${var.module_name}_rule"
  description         = "Rule to trigger ${var.module_name} lambda every 30 minutes"
  schedule_expression = var.rate
}

resource "aws_cloudwatch_event_target" "lambda_target" {
  target_id = "${var.module_name}_target"
  rule      = aws_cloudwatch_event_rule.lambda_rule.name
  arn       = aws_lambda_function.blog_scraper_lambda.arn
}

resource "aws_lambda_permission" "allow_cloudwatch_to_call_lambda" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.blog_scraper_lambda.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.lambda_rule.arn
}

resource "aws_cloudwatch_log_group" "lambda_log_group" {
  name              = "/aws/lambda/${var.module_name}"
  retention_in_days = 14
}

# CloudWatch alarm for Lambda errors
resource "aws_cloudwatch_metric_alarm" "lambda_error_alarm" {
  alarm_name          = "${var.module_name}_error_alarm"
  comparison_operator = "GreaterThanThreshold"
  evaluation_periods  = "2"
  metric_name         = "Errors"
  namespace           = "AWS/Lambda"
  period              = "300"
  statistic           = "Sum"
  threshold           = "0"
  alarm_description   = "This metric monitors lambda errors for ${var.module_name}"
  alarm_actions       = [var.sns_topic_arn]

  dimensions = {
    FunctionName = aws_lambda_function.blog_scraper_lambda.function_name
  }
}
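The `null_resource` above just shells out to `uv` and copies the handler into a build directory before `archive_file` zips it. A minimal standalone sketch of those packaging steps (the handler body is a placeholder, and the real `uv` install is shown as a comment rather than run):

```shell
# Sketch of the packaging the null_resource performs (illustrative handler body)
rm -rf .build
mkdir -p .build

# In the real module this step is: uv pip install --system --target .build -r pyproject.toml
printf 'def lambda_handler(event, context):\n    return {"statusCode": 200}\n' > main.py

# Copy the handler next to the installed dependencies
cp main.py .build/
ls .build
```

Everything under `.build/` then becomes the root of `lambda.zip`, which is why the handler string is simply `main.lambda_handler`.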
terraform/modules/blog_scraper/outputs.tf (new file, 14 lines)
@@ -0,0 +1,14 @@
output "lambda_function_arn" {
  description = "ARN of the blog scraper Lambda function"
  value       = aws_lambda_function.blog_scraper_lambda.arn
}

output "lambda_function_name" {
  description = "Name of the blog scraper Lambda function"
  value       = aws_lambda_function.blog_scraper_lambda.function_name
}

output "cloudwatch_event_rule_arn" {
  description = "ARN of the CloudWatch Event Rule"
  value       = aws_cloudwatch_event_rule.lambda_rule.arn
}
terraform/modules/blog_scraper/variables.tf (new file, 28 lines)
@@ -0,0 +1,28 @@
variable "module_name" {
  description = "Name of the module"
  type        = string
  default     = "blog_scraper"
}

variable "rate" {
  description = "Rate expression for CloudWatch Events rule"
  type        = string
  default     = "rate(30 minutes)"
}

variable "sns_topic_arn" {
  description = "SNS topic ARN for Lambda alarms"
  type        = string
}

variable "directus_base_url" {
  description = "Base URL for Directus CMS"
  type        = string
  default     = "https://cms.deflock.me"
}

variable "directus_api_token" {
  description = "API token for Directus CMS"
  type        = string
  sensitive   = true
}
@@ -12,3 +12,14 @@ variable "alarm_phone_number" {
   description = "Phone number to receive alarm notifications"
   # intentionally left blank since this file is checked into git
 }
+
+variable "directus_base_url" {
+  description = "Base URL for Directus CMS"
+  default     = "https://cms.deflock.me"
+}
+
+variable "directus_api_token" {
+  description = "API token for Directus CMS"
+  sensitive   = true
+  # intentionally left blank since this file is checked into git
+}
@@ -72,5 +72,9 @@
     <loc>https://deflock.me/store</loc>
     <changefreq>daily</changefreq>
   </url>
+  <url>
+    <loc>https://deflock.me/blog</loc>
+    <changefreq>daily</changefreq>
+  </url>
 </urlset>
@@ -37,11 +37,12 @@ const items = [
   { title: 'Home', icon: 'mdi-home', to: '/' },
   { title: 'Map', icon: 'mdi-map', to: '/map' },
   { title: 'Learn', icon: 'mdi-school', to: '/what-is-an-alpr' },
-  { title: 'Store', icon: 'mdi-shopping', to: '/store' },
+  { title: 'News', icon: 'mdi-newspaper', to: '/blog' },
 ]
 
 const contributeItems = [
   { title: 'Submit Cameras', icon: 'mdi-map-marker-plus', to: '/report' },
+  { title: 'Hang Signs', icon: 'mdi-sign-direction', to: '/store' },
   { title: 'Public Records', icon: 'mdi-file-document', to: '/foia' },
   { title: 'City Council', icon: 'mdi-account-voice', to: '/council' },
 ]
@@ -176,8 +177,8 @@ watch(() => theme.global.name.value, (newTheme) => {
 
       <v-spacer class="d-md-none" />
 
-      <v-btn icon>
-        <v-icon @click="toggleTheme" aria-label="Toggle Theme">mdi-theme-light-dark</v-icon>
+      <v-btn icon @click="toggleTheme" aria-label="Toggle Theme">
+        <v-icon>mdi-theme-light-dark</v-icon>
       </v-btn>
     </v-app-bar>
 
@@ -8,7 +8,7 @@
   >
     <v-col cols="12" md="8">
       <h1 class="mb-4">{{ title }}</h1>
-      <p class="mb-4">
+      <p class="mb-4 px-8">
         {{ description }}
       </p>
       <v-btn
@@ -159,6 +159,22 @@ const router = createRouter({
       title: 'Store | DeFlock'
     }
   },
+  {
+    path: '/blog',
+    name: 'blog',
+    component: () => import('../views/Blog.vue'),
+    meta: {
+      title: 'News | DeFlock'
+    }
+  },
+  {
+    path: '/blog/:id',
+    name: 'blog-post',
+    component: () => import('../views/BlogPost.vue'),
+    meta: {
+      title: 'Blog Post | DeFlock'
+    }
+  },
   {
     path: '/:pathMatch(.*)*',
     name: 'not-found',
webapp/src/services/blogService.ts (new file, 80 lines)
@@ -0,0 +1,80 @@
import axios from "axios";

export interface BlogPost {
  id: number;
  published: string;
  description: string;
  content: string | null;
  title: string;
  externalUrl?: string;
}

export interface BlogResponse {
  data: BlogPost[];
  meta?: {
    total_count: number;
    filter_count: number;
  };
}

export interface BlogQueryParams {
  limit?: number;
  offset?: number;
  page?: number;
  sort?: string;
  fields?: string[];
}

const CMS_BASE_URL = "https://cms.deflock.me";

const blogApiService = axios.create({
  baseURL: CMS_BASE_URL,
  timeout: 10000,
});

export const blogService = {
  async getBlogPosts(params: BlogQueryParams = {}): Promise<BlogResponse> {
    const queryParams = new URLSearchParams();

    // Set default sorting by newest first
    const sort = params.sort || "-date_created";
    queryParams.append("sort", sort);

    // Set pagination parameters
    if (params.limit) {
      queryParams.append("limit", params.limit.toString());
    }
    if (params.offset) {
      queryParams.append("offset", params.offset.toString());
    }
    if (params.page) {
      queryParams.append("page", params.page.toString());
    }

    // Set fields if specified
    if (params.fields && params.fields.length > 0) {
      queryParams.append("fields", params.fields.join(","));
    }

    // Request metadata for pagination
    queryParams.append("meta", "total_count,filter_count");

    try {
      const response = await blogApiService.get(`/items/blog?${queryParams.toString()}`);
      return response.data;
    } catch (error) {
      console.error("Error fetching blog posts:", error);
      throw new Error("Failed to fetch blog posts");
    }
  },

  async getBlogPost(id: number): Promise<BlogPost> {
    try {
      const response = await blogApiService.get(`/items/blog/${id}?t=${Date.now()}`);
      return response.data.data;
    } catch (error) {
      console.error(`Error fetching blog post ${id}:`, error);
      throw new Error(`Failed to fetch blog post ${id}`);
    }
  }
};
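As a sanity check on the service above, the query string `getBlogPosts` assembles for a typical page request can be reproduced standalone. This sketch mirrors the construction logic in the service rather than importing it; the sort, page, and field values are illustrative:

```typescript
// Standalone sketch of blogService's query construction (values are illustrative)
const queryParams = new URLSearchParams();
queryParams.append("sort", "-published");
queryParams.append("limit", "5");
queryParams.append("page", "2");
// Directus takes a comma-separated field list; URLSearchParams percent-encodes the commas
queryParams.append("fields", ["id", "title", "published"].join(","));
queryParams.append("meta", "total_count,filter_count");
console.log(queryParams.toString());
```

Directus accepts the percent-encoded commas in `fields` and `meta`, so no manual encoding is needed on top of `URLSearchParams`.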
webapp/src/views/Blog.vue (new file, 192 lines)
@@ -0,0 +1,192 @@
<template>
  <DefaultLayout>
    <template #header>
      <Hero
        title="DeFlock News"
        description="The latest news on LPRs and surveillance from us and our partners."
        gradient="linear-gradient(135deg, rgb(var(--v-theme-primary)) 0%, rgb(var(--v-theme-secondary)) 100%)"
      />
    </template>

    <v-container class="py-8">
      <!-- Loading State -->
      <div v-if="loading" class="d-flex justify-center align-center" style="min-height: 200px;">
        <v-progress-circular indeterminate size="64" color="primary"></v-progress-circular>
      </div>

      <!-- Error State -->
      <v-alert v-else-if="error" type="error" class="mb-6">
        {{ error }}
        <template #append>
          <v-btn variant="outlined" size="small" @click="fetchBlogPosts">
            Retry
          </v-btn>
        </template>
      </v-alert>

      <!-- Blog Posts List -->
      <div v-else>
        <div v-if="blogPosts.length > 0" class="mx-auto" style="max-width: 900px;">
          <article
            v-for="post in blogPosts"
            :key="post.id"
            class="mb-8"
          >
            <v-card
              class="rounded-xl transition-all cursor-pointer mx-4 mx-sm-0"
              :href="post.externalUrl || `/blog/${post.id}`"
              :target="post.externalUrl ? '_blank' : undefined"
              :to="post.externalUrl ? undefined : `/blog/${post.id}`"
              flat
            >
              <v-card-text class="pa-8">
                <div class="mb-3">
                  <h2 class="font-weight-medium mb-0">{{ post.title }}</h2>
                </div>

                <p class="text-caption text-uppercase font-weight-medium text-medium-emphasis mb-4">
                  {{ formatDate(post.published) }}
                </p>

                <p class="text-body-1 mb-6" style="line-height: 1.6;">
                  {{ post.description }}
                </p>

                <div class="d-flex align-center justify-space-between">
                  <span class="text-body-2 font-weight-medium text-primary">
                    {{ post.externalUrl ? `Read on ${getExternalOrigin(post.externalUrl)}` : 'Read full article' }}
                  </span>
                  <v-icon
                    size="20"
                    color="primary"
                    :icon="post.externalUrl ? 'mdi-open-in-new' : 'mdi-arrow-right'"
                  ></v-icon>
                </div>
              </v-card-text>
            </v-card>
          </article>
        </div>

        <!-- Empty State -->
        <div v-else class="text-center py-12">
          <v-icon size="64" color="grey-lighten-1" class="mb-4">mdi-post-outline</v-icon>
          <h3 class="text-h5 text-grey-darken-1 mb-2">No blog posts yet</h3>
          <p class="text-body-1 text-grey-darken-2">Check back later for updates!</p>
        </div>

        <!-- Pagination -->
        <div v-if="blogPosts.length > 0" class="d-flex justify-center mt-8">
          <v-pagination
            class="pl-0"
            v-model="currentPage"
            :length="totalPages > 0 ? totalPages : 1"
            :total-visible="3"
            :disabled="totalPages <= 1"
            @update:model-value="onPageChange"
          ></v-pagination>
        </div>
      </div>
    </v-container>
  </DefaultLayout>
</template>

<script setup lang="ts">
import { ref, onMounted, computed, watch } from 'vue';
import { useRoute, useRouter } from 'vue-router';
import Hero from '@/components/layout/Hero.vue';
import DefaultLayout from '@/layouts/DefaultLayout.vue';
import { blogService, type BlogPost } from '@/services/blogService';

// Router
const route = useRoute();
const router = useRouter();

// Reactive state
const blogPosts = ref<BlogPost[]>([]);
const loading = ref(false);
const error = ref<string | null>(null);
const totalCount = ref(0);
const postsPerPage = 5; // Fewer posts per page for larger cards

// Current page from route query parameter
const currentPage = computed({
  get: () => {
    const page = parseInt(route.query.page as string) || 1;
    return page > 0 ? page : 1;
  },
  set: (page: number) => {
    router.push({
      path: route.path,
      query: { ...route.query, page: page > 1 ? page.toString() : undefined }
    });
  }
});

// Computed properties
const totalPages = computed(() => Math.ceil(totalCount.value / postsPerPage));

// Methods
const formatDate = (dateString: string) => {
  const date = new Date(dateString);
  return date.toLocaleDateString('en-US', {
    year: 'numeric',
    month: 'long',
    day: 'numeric'
  });
};

const getExternalOrigin = (url: string) => {
  try {
    const urlObj = new URL(url);
    return urlObj.hostname;
  } catch {
    return 'external site';
  }
};

const fetchBlogPosts = async (page = 1) => {
  loading.value = true;
  error.value = null;

  try {
    const response = await blogService.getBlogPosts({
      limit: postsPerPage,
      page: page,
      sort: '-published',
      fields: ['id', 'title', 'description', 'published', 'externalUrl']
    });

    blogPosts.value = response.data;
    totalCount.value = response.meta?.total_count || response.data.length;
  } catch (err) {
    error.value = err instanceof Error ? err.message : 'Failed to load blog posts';
    console.error('Error fetching blog posts:', err);
  } finally {
    loading.value = false;
  }
};

const onPageChange = (page: number) => {
  currentPage.value = page;
  // Scroll to top of the page
  window.scrollTo({ top: 0, behavior: 'smooth' });
};

// Watch route query changes to fetch posts when page parameter changes
watch(() => route.query.page, (newPage) => {
  const page = parseInt(newPage as string) || 1;
  fetchBlogPosts(page);
}, { immediate: false });

// Lifecycle
onMounted(() => {
  fetchBlogPosts(currentPage.value);
});
</script>

<style scoped>
/* Fix for pagination padding issue */
:deep(.v-pagination__list) {
  padding-left: 0 !important;
}
</style>
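The `totalPages` computed in the view above is plain ceiling division over the `total_count` that Directus reports; a minimal sketch with illustrative numbers:

```typescript
// Sketch of the pagination math in Blog.vue (counts are illustrative)
const postsPerPage = 5;
const totalCount = 12; // e.g. meta.total_count from the Directus response
const totalPages = Math.ceil(totalCount / postsPerPage);
console.log(totalPages); // 3 pages: 5 + 5 + 2 posts
```

The view additionally clamps the rendered `length` to at least 1 so the pagination control stays visible when there are no posts yet.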
webapp/src/views/BlogPost.vue (new file, 232 lines)
@@ -0,0 +1,232 @@
<template>
  <DefaultLayout>
    <!-- Loading State -->
    <div v-if="loading" class="d-flex justify-center align-center" style="min-height: 50vh;">
      <v-progress-circular indeterminate size="64" color="primary"></v-progress-circular>
    </div>

    <!-- Error State -->
    <v-container v-else-if="error" class="py-8">
      <v-btn
        :to="{ name: 'blog' }"
        variant="text"
        prepend-icon="mdi-arrow-left"
      >
        Back to News
      </v-btn>
      <v-alert type="error" class="mt-6" variant="tonal">
        {{ error }}
      </v-alert>
    </v-container>

    <!-- Blog Post Content -->
    <div v-else-if="blogPost">
      <!-- Header -->
      <v-container class="py-8">
        <div class="mb-6">
          <v-btn
            :to="{ name: 'blog' }"
            variant="text"
            size="small"
            prepend-icon="mdi-arrow-left"
            class="mb-4"
          >
            Back to News
          </v-btn>

          <h1 class="text-h3 text-md-h2 font-weight-bold mb-4 mt-0">
            {{ blogPost.title }}
          </h1>

          <v-card flat class="mb-6" color="transparent">
            <div class="d-flex flex-column flex-sm-row">
              <v-chip
                prepend-icon="mdi-account"
                color="grey-darken-1"
                variant="text"
                size="default"
              >
                by Will Freeman
              </v-chip>
              <v-chip
                prepend-icon="mdi-calendar"
                color="grey-darken-1"
                variant="text"
                size="default"
              >
                {{ formatDate(blogPost.published) }}
              </v-chip>
            </div>
          </v-card>
        </div>

        <!-- Blog Content -->
        <v-card v-if="blogPost.content" elevation="0" class="bg-transparent">
          <v-card-text class="pa-0">
            <div
              class="blog-content"
              v-html="blogPost.content"
            ></div>
          </v-card-text>
        </v-card>
      </v-container>
    </div>
  </DefaultLayout>
</template>

<script setup lang="ts">
import { ref, onMounted } from 'vue';
import { useRoute } from 'vue-router';
import DefaultLayout from '@/layouts/DefaultLayout.vue';
import { blogService, type BlogPost } from '@/services/blogService';

const route = useRoute();

// Reactive state
const blogPost = ref<BlogPost | null>(null);
const loading = ref(false);
const error = ref<string | null>(null);

// Methods
const formatDate = (dateString: string) => {
  const date = new Date(dateString);
  return date.toLocaleDateString('en-US', {
    year: 'numeric',
    month: 'long',
    day: 'numeric'
  });
};

const fetchBlogPost = async () => {
  const postId = route.params.id;

  if (!postId || Array.isArray(postId)) {
    error.value = 'Invalid blog post ID';
    return;
  }

  const id = parseInt(postId, 10);
  if (isNaN(id)) {
    error.value = 'Invalid blog post ID';
    return;
  }

  loading.value = true;
  error.value = null;

  try {
    blogPost.value = await blogService.getBlogPost(id);
  } catch (err) {
    error.value = err instanceof Error ? err.message : 'Failed to load blog post';
    console.error('Error fetching blog post:', err);
  } finally {
    loading.value = false;
  }
};

// Lifecycle
onMounted(() => {
  fetchBlogPost();
});
</script>

<style scoped>
.blog-content {
  font-size: 1.1rem;
  line-height: 1.7;
}

.blog-content :deep(h1),
.blog-content :deep(h2),
.blog-content :deep(h3),
.blog-content :deep(h4),
.blog-content :deep(h5),
.blog-content :deep(h6) {
  margin-top: 2rem;
  margin-bottom: 1rem;
  font-weight: 600;
}

.blog-content :deep(h1) { font-size: 2.5rem; }
.blog-content :deep(h2) { font-size: 2rem; }
.blog-content :deep(h3) { font-size: 1.75rem; }
.blog-content :deep(h4) { font-size: 1.5rem; }
.blog-content :deep(h5) { font-size: 1.25rem; }
.blog-content :deep(h6) { font-size: 1.1rem; }

.blog-content :deep(p) {
  margin-bottom: 1.5rem;
}

.blog-content :deep(ul),
.blog-content :deep(ol) {
  margin-bottom: 1.5rem;
  padding-left: 2rem;
}

.blog-content :deep(li) {
  margin-bottom: 0.5rem;
}

.blog-content :deep(blockquote) {
  margin: 2rem 0;
  padding: 1rem 1.5rem;
  border-left: 4px solid rgb(var(--v-theme-primary));
  background-color: rgba(var(--v-theme-surface-variant), 0.1);
  font-style: italic;
}

.blog-content :deep(code) {
  background-color: rgba(var(--v-theme-surface-variant), 0.2);
  padding: 0.2rem 0.4rem;
  border-radius: 4px;
  font-family: 'Courier New', monospace;
}

.blog-content :deep(pre) {
  background-color: rgba(var(--v-theme-surface-variant), 0.1);
  padding: 1rem;
  border-radius: 8px;
  overflow-x: auto;
  margin: 1.5rem 0;
}

.blog-content :deep(pre code) {
  background-color: transparent;
  padding: 0;
}

.blog-content :deep(a) {
  color: rgb(var(--v-theme-primary));
  text-decoration: none;
}

.blog-content :deep(a:hover) {
  text-decoration: underline;
}

.blog-content :deep(img) {
  max-width: 100%;
  height: auto;
  border-radius: 8px;
  margin: 1.5rem 0;
}

.blog-content :deep(table) {
  width: 100%;
  border-collapse: collapse;
  margin: 1.5rem 0;
}

.blog-content :deep(th),
.blog-content :deep(td) {
  border: 1px solid rgba(var(--v-theme-outline), 0.2);
  padding: 0.75rem;
  text-align: left;
}

.blog-content :deep(th) {
  background-color: rgba(var(--v-theme-surface-variant), 0.1);
  font-weight: 600;
}
</style>
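The route-param guard in `fetchBlogPost` can be factored into a small pure function for illustration; `parseBlogId` is a hypothetical name, not part of the component:

```typescript
// Sketch of BlogPost.vue's route-param validation (parseBlogId is a hypothetical helper)
const parseBlogId = (raw: string | string[] | undefined): number | null => {
  // Reject missing params and array-valued params (repeated route segments)
  if (!raw || Array.isArray(raw)) return null;
  const id = parseInt(raw, 10);
  return isNaN(id) ? null : id;
};

console.log(parseBlogId("42"));  // 42
console.log(parseBlogId("abc")); // null
```

Collapsing both invalid cases to `null` lets the caller set a single "Invalid blog post ID" error instead of checking each condition inline.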
@@ -2,7 +2,7 @@
   <DefaultLayout>
     <template #header>
       <Hero
-        title="DeFlock Store"
+        title="DeFlock Store (Coming Soon)"
         description="Full store coming soon! In the meantime, check out our free Downloads."
         gradient="linear-gradient(135deg, rgb(var(--v-theme-primary)) 0%, rgb(var(--v-theme-secondary)) 100%)"
       />