mirror of
https://github.com/elder-plinius/R00TS.git
synced 2026-02-12 17:22:52 +00:00
Resolve merge conflicts and add production-ready backend with MongoDB
This commit is contained in:
7
.env.example
Normal file
@@ -0,0 +1,7 @@
# .env file
GEMINI_API_KEY=your_gemini_api_key_here
OPENAI_API_KEY=your_openai_api_key_here
ELEVENLABS_API_KEY=your_elevenlabs_api_key_here

#INPUTS_PATH=./inputs
#OUTPUTS_PATH=./outputs
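The `KEY=value` format above can be read without any dependencies. The server most likely loads it with the `dotenv` package, but as an illustration of the format, here is a minimal sketch of a dotenv-style parser (the `parseEnv` helper is hypothetical, not part of the repository):

```javascript
// Minimal sketch of a .env parser. Blank lines and lines starting with
// "#" (such as "#INPUTS_PATH=./inputs") are skipped; everything after
// the first "=" becomes the value.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // comment or blank
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // malformed line without "=": ignore it
    vars[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return vars;
}

const example = [
  '# .env file',
  'GEMINI_API_KEY=your_gemini_api_key_here',
  '#INPUTS_PATH=./inputs',
].join('\n');
console.log(parseEnv(example).GEMINI_API_KEY); // "your_gemini_api_key_here"
```

In practice a real deployment would call `require('dotenv').config()` once at startup instead of hand-rolling this, but the parsing rules are the same.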
201
LICENSE
Normal file
@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
33
README.md
@@ -6,7 +6,7 @@ R00TS is an interactive web application that allows users to contribute words th
 
 The concept behind R00TS is to create a collective "garden" of words that people from around the world can contribute to - essentially "planting seeds" of vocabulary that will grow in the consciousness of future AI systems.
 
-While the current implementation uses client-side storage for demonstration purposes, the concept could be expanded to a global database where everyone's contributions help shape a collective understanding of what words humans believe are important for AI to comprehend.
+The application now features a production-ready backend with MongoDB for data persistence, allowing for global collection of contributions and scaling to handle large volumes of data.
 
 ## Features
 
@@ -25,29 +25,48 @@ While the current implementation uses client-side storage for demonstration purp
 
 ## Technical Implementation
 
-- Pure HTML, CSS, and JavaScript
+- Frontend: HTML, CSS, and JavaScript
+- Backend: Node.js with Express and MongoDB
 - Uses D3.js for the word cloud visualization
 - Bootstrap for responsive styling
-- No backend required (uses client-side storage for demonstration)
+- RESTful API for data operations
+- Graceful fallback to localStorage if the server is unavailable
 
 ## Running Locally
 
-Simply open `index.html` in a web browser. No server required.
+### Frontend Only (Demo Mode)
+
+Simply open `index.html` in a web browser. This will use localStorage for data storage.
+
+### Full Stack (Production Mode)
+
+1. Install MongoDB locally or set up a MongoDB Atlas account
+2. Navigate to the server directory: `cd server`
+3. Install dependencies: `npm install`
+4. Configure your environment variables in `.env` file
+5. Start the server: `npm start`
+6. Open `index.html` in a web browser or serve it with a static file server
 
 ## Future Enhancements
 
-- Server-side storage for global word collection
 - User accounts to track individual contributions
 - Regional visualizations to see how word importance varies by culture
 - Sentiment analysis of submitted words
 - Category tagging for submitted words
+- Advanced analytics and reporting
+- Enhanced data visualization options
 - Social sharing functionality
 
 ## Repository Structure
 
-- `/SYSTEM PROMPTS` - Collection of AI system prompts for reference
+- `/server` - Backend server code
+- `/models` - MongoDB data models
+- `/routes` - API route definitions
+- `server.js` - Main server file
 - `index.html` - Main application page
-- `script.js` - JavaScript functionality
+- `datasets.html` - Dataset management page
+- `script.js` - Core JavaScript functionality
+- `data_manager.js` - Data management functionality
 - `styles.css` - CSS styling
 - `README.md` - This documentation file
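The "graceful fallback to localStorage" added to the README can be sketched as a single server-first load function. This is a sketch only, with hypothetical `fetchRemote`/`readLocal` stand-ins for the real fetch call and localStorage access in `script.js` and `data_manager.js`:

```javascript
// Server-first load with a local fallback: try the API, and if the
// request fails or returns a non-2xx status, fall back to whatever
// was cached locally. The two callbacks are hypothetical stand-ins.
async function loadWords(fetchRemote, readLocal) {
  try {
    const response = await fetchRemote(); // e.g. fetch('/api/words')
    if (!response.ok) throw new Error(`API error: ${response.status}`);
    return await response.json();
  } catch (err) {
    // Server unreachable or errored: use the locally cached copy,
    // e.g. JSON.parse(localStorage.getItem('words')) || []
    return readLocal();
  }
}

// Demo with stand-ins: a failing "server" forces the local fallback.
loadWords(
  async () => { throw new Error('server unreachable'); },
  () => ['cached-word']
).then(words => console.log(words));
```

The same shape works for writes: attempt the POST first, and queue the submission in localStorage when the server is down.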
390
dashboard.html
Normal file
@@ -0,0 +1,390 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>R00TS Admin Dashboard</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.10.0/font/bootstrap-icons.css">
    <style>
        body {
            background-color: #f8f9fa;
            font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
        }
        .dashboard-container {
            max-width: 1200px;
            margin: 0 auto;
            padding: 20px;
        }
        .card {
            margin-bottom: 20px;
            border-radius: 10px;
            box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
        }
        .card-header {
            font-weight: bold;
            background-color: #4CAF50;
            color: white;
            border-radius: 10px 10px 0 0 !important;
        }
        .status-indicator {
            display: inline-block;
            width: 12px;
            height: 12px;
            border-radius: 50%;
            margin-right: 5px;
        }
        .status-healthy {
            background-color: #4CAF50;
        }
        .status-warning {
            background-color: #FFC107;
        }
        .status-error {
            background-color: #DC3545;
        }
        .refresh-btn {
            background-color: #4CAF50;
            border-color: #4CAF50;
        }
        .refresh-btn:hover {
            background-color: #3e8e41;
            border-color: #3e8e41;
        }
        .stats-value {
            font-size: 2rem;
            font-weight: bold;
            color: #4CAF50;
        }
        .stats-label {
            color: #6c757d;
            font-size: 0.9rem;
        }
        .action-btn {
            margin-right: 5px;
            margin-bottom: 5px;
        }
        #last-updated {
            font-style: italic;
            color: #6c757d;
        }
        .backup-item {
            border-left: 3px solid #4CAF50;
            padding-left: 10px;
            margin-bottom: 10px;
        }
        .memory-bar {
            height: 5px;
            background-color: #e9ecef;
            border-radius: 5px;
            margin-top: 5px;
        }
        .memory-used {
            height: 100%;
            background-color: #4CAF50;
            border-radius: 5px;
        }
    </style>
</head>
<body>
    <div class="dashboard-container">
        <div class="d-flex justify-content-between align-items-center mb-4">
            <h1>R00TS Admin Dashboard</h1>
            <button id="refresh-btn" class="btn btn-primary refresh-btn">
                <i class="bi bi-arrow-clockwise"></i> Refresh
            </button>
        </div>

        <div class="row">
            <!-- System Status Card -->
            <div class="col-md-6">
                <div class="card">
                    <div class="card-header">System Status</div>
                    <div class="card-body">
                        <div class="d-flex justify-content-between mb-3">
                            <div>
                                <span id="server-status-indicator" class="status-indicator"></span>
                                <span id="server-status">Checking server status...</span>
                            </div>
                            <div>
                                <span id="db-status-indicator" class="status-indicator"></span>
                                <span id="db-status">Checking database status...</span>
                            </div>
                        </div>
                        <div class="mb-3">
                            <p class="mb-1">Server Uptime:</p>
                            <h4 id="server-uptime">Loading...</h4>
                        </div>
                        <div class="mb-3">
                            <p class="mb-1">Memory Usage:</p>
                            <div class="d-flex justify-content-between">
                                <span id="memory-usage">Loading...</span>
                                <span id="memory-percentage">0%</span>
                            </div>
                            <div class="memory-bar">
                                <div id="memory-bar-used" class="memory-used" style="width: 0%"></div>
                            </div>
                        </div>
                        <div class="mt-4">
                            <button id="restart-server" class="btn btn-warning action-btn">Restart Server</button>
                            <button id="view-logs" class="btn btn-info action-btn">View Logs</button>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Statistics Card -->
            <div class="col-md-6">
                <div class="card">
                    <div class="card-header">Statistics</div>
                    <div class="card-body">
                        <div class="row text-center">
                            <div class="col-6 mb-4">
                                <div class="stats-value" id="total-words">-</div>
                                <div class="stats-label">Total Words</div>
                            </div>
                            <div class="col-6 mb-4">
                                <div class="stats-value" id="unique-words">-</div>
                                <div class="stats-label">Unique Words</div>
                            </div>
                            <div class="col-6">
                                <div class="stats-value" id="total-datasets">-</div>
                                <div class="stats-label">Datasets</div>
                            </div>
                            <div class="col-6">
                                <div class="stats-value" id="last-submission">-</div>
                                <div class="stats-label">Last Submission</div>
                            </div>
                        </div>
                    </div>
                </div>
            </div>
        </div>

        <div class="row mt-3">
            <!-- Backup Management Card -->
            <div class="col-md-6">
                <div class="card">
                    <div class="card-header">Backup Management</div>
                    <div class="card-body">
                        <div class="d-flex justify-content-between mb-3">
                            <h5>Recent Backups</h5>
                            <button id="create-backup" class="btn btn-sm btn-success">Create Backup Now</button>
                        </div>
                        <div id="backup-list">
                            <p>Loading backups...</p>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Quick Actions Card -->
            <div class="col-md-6">
                <div class="card">
                    <div class="card-header">Quick Actions</div>
                    <div class="card-body">
                        <div class="d-grid gap-2">
                            <a href="index.html" class="btn btn-outline-success mb-2" target="_blank">Open R00TS Application</a>
                            <a href="datasets.html" class="btn btn-outline-success mb-2" target="_blank">Manage Datasets</a>
                            <button id="export-all-data" class="btn btn-outline-primary mb-2">Export All Data</button>
                        </div>
                    </div>
                </div>
            </div>
        </div>

        <p id="last-updated" class="mt-3 text-center">Last updated: Never</p>
    </div>

    <script>
        // API base URL
        const API_BASE_URL = 'http://localhost:5000/api';

        // Format a duration in seconds as "1d 2h 3m 4s"
        function formatUptime(seconds) {
            const days = Math.floor(seconds / 86400);
            const hours = Math.floor((seconds % 86400) / 3600);
            const minutes = Math.floor((seconds % 3600) / 60);
            const secs = Math.floor(seconds % 60);

            let result = '';
            if (days > 0) result += `${days}d `;
            if (hours > 0) result += `${hours}h `;
            if (minutes > 0) result += `${minutes}m `;
            result += `${secs}s`;

            return result;
        }

        // Format a byte count as a human-readable size
        function formatBytes(bytes, decimals = 2) {
            if (bytes === 0) return '0 Bytes';

            const k = 1024;
            const dm = decimals < 0 ? 0 : decimals;
            const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];

            const i = Math.floor(Math.log(bytes) / Math.log(k));

            return parseFloat((bytes / Math.pow(k, i)).toFixed(dm)) + ' ' + sizes[i];
        }

        // Format an ISO date string using the browser locale
        function formatDate(dateString) {
            const date = new Date(dateString);
            return date.toLocaleString();
        }

        // Fetch health, statistics, and dataset data and refresh the UI
        async function updateDashboard() {
            try {
                // Fetch health data
                const healthResponse = await fetch(`${API_BASE_URL}/health`);
                const healthData = await healthResponse.json();

                // Update server status
                document.getElementById('server-status').textContent = 'Server: Online';
                document.getElementById('server-status-indicator').className = 'status-indicator status-healthy';

                // Update database status
                const dbStatus = healthData.database.status;
                document.getElementById('db-status').textContent = `Database: ${dbStatus === 'connected' ? 'Connected' : 'Disconnected'}`;
                document.getElementById('db-status-indicator').className = `status-indicator ${dbStatus === 'connected' ? 'status-healthy' : 'status-error'}`;

                // Update uptime
                document.getElementById('server-uptime').textContent = formatUptime(healthData.server.uptime);

                // Update memory usage
                const rss = healthData.server.memory.rss;
                const heapTotal = healthData.server.memory.heapTotal;
                const heapUsed = healthData.server.memory.heapUsed;
                const memoryPercentage = Math.round((heapUsed / heapTotal) * 100);

                document.getElementById('memory-usage').textContent = `${formatBytes(heapUsed)} / ${formatBytes(heapTotal)}`;
                document.getElementById('memory-percentage').textContent = `${memoryPercentage}%`;
                document.getElementById('memory-bar-used').style.width = `${memoryPercentage}%`;

                // Fetch statistics
                const statsResponse = await fetch(`${API_BASE_URL}/words/stats`);
                const statsData = await statsResponse.json();

                document.getElementById('total-words').textContent = statsData.totalSubmissions || 0;
                document.getElementById('unique-words').textContent = statsData.uniqueWords || 0;

                // Fetch datasets
                const datasetsResponse = await fetch(`${API_BASE_URL}/datasets`);
                const datasetsData = await datasetsResponse.json();

                document.getElementById('total-datasets').textContent = datasetsData.length || 0;

                // Get last submission time
                if (statsData.lastSubmission) {
                    const lastSubmissionDate = new Date(statsData.lastSubmission);
                    const now = new Date();
                    const diffMinutes = Math.floor((now - lastSubmissionDate) / (1000 * 60));

                    if (diffMinutes < 60) {
                        document.getElementById('last-submission').textContent = `${diffMinutes}m ago`;
                    } else if (diffMinutes < 1440) {
                        document.getElementById('last-submission').textContent = `${Math.floor(diffMinutes / 60)}h ago`;
                    } else {
                        document.getElementById('last-submission').textContent = `${Math.floor(diffMinutes / 1440)}d ago`;
                    }
                } else {
                    document.getElementById('last-submission').textContent = 'Never';
                }

                // Update last updated time
                document.getElementById('last-updated').textContent = `Last updated: ${new Date().toLocaleString()}`;

                // Simulate backup list (in a real implementation, you would fetch this from the server)
                const backupList = document.getElementById('backup-list');
                backupList.innerHTML = '';

                // Create dummy backup data (in production, this would come from the server)
                const backups = [
                    { date: new Date(Date.now() - 86400000), size: 1024 * 1024 * 2.5 },
                    { date: new Date(Date.now() - 86400000 * 2), size: 1024 * 1024 * 2.3 },
                    { date: new Date(Date.now() - 86400000 * 3), size: 1024 * 1024 * 2.1 }
                ];

                backups.forEach(backup => {
                    const backupItem = document.createElement('div');
                    backupItem.className = 'backup-item';
                    backupItem.innerHTML = `
                        <div class="d-flex justify-content-between">
                            <div>
                                <strong>${backup.date.toLocaleDateString()}</strong> at ${backup.date.toLocaleTimeString()}
                                <br>
                                <small>${formatBytes(backup.size)}</small>
                            </div>
                            <div>
                                <button class="btn btn-sm btn-outline-secondary">Download</button>
                            </div>
                        </div>
                    `;
                    backupList.appendChild(backupItem);
                });

            } catch (error) {
                console.error('Error updating dashboard:', error);
                document.getElementById('server-status').textContent = 'Server: Offline';
                document.getElementById('server-status-indicator').className = 'status-indicator status-error';
                document.getElementById('db-status').textContent = 'Database: Unknown';
                document.getElementById('db-status-indicator').className = 'status-indicator status-error';
            }
        }

        // Initial update
        updateDashboard();

        // Setup refresh button
        document.getElementById('refresh-btn').addEventListener('click', updateDashboard);

        // Setup action buttons
        document.getElementById('restart-server').addEventListener('click', async () => {
            if (confirm('Are you sure you want to restart the server?')) {
                try {
                    // In a real implementation, you would have an API endpoint to restart the server
                    alert('Server restart initiated. The dashboard will refresh in 10 seconds.');
                    setTimeout(updateDashboard, 10000);
                } catch (error) {
                    console.error('Error restarting server:', error);
                    alert('Failed to restart server. Check the console for details.');
                }
            }
        });

        document.getElementById('view-logs').addEventListener('click', () => {
            // In a real implementation, you would have a logs viewer or redirect to a logs page
            alert('Log viewer is not implemented in this demo. Check the server logs directory.');
        });

        document.getElementById('create-backup').addEventListener('click', async () => {
            try {
                // In a real implementation, you would have an API endpoint to create a backup
                alert('Backup creation initiated. The dashboard will refresh in 5 seconds.');
                setTimeout(updateDashboard, 5000);
            } catch (error) {
                console.error('Error creating backup:', error);
                alert('Failed to create backup. Check the console for details.');
            }
        });

        document.getElementById('export-all-data').addEventListener('click', async () => {
            try {
                // In a real implementation, you would have an API endpoint to export all data
                alert('Data export is not implemented in this demo.');
            } catch (error) {
                console.error('Error exporting data:', error);
                alert('Failed to export data. Check the console for details.');
            }
        });

        // Auto-refresh every 30 seconds
        setInterval(updateDashboard, 30000);
    </script>

    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
</body>
</html>
166
data_manager.js
@@ -2,55 +2,97 @@
 class DataManager {
     constructor(backupInterval = 1800000) { // Default: 30 minutes
         this.backupInterval = backupInterval;
-        this.dataDir = 'datasets';
-        this.initializeDataDirectory();
+        this.apiBaseUrl = '/api';
         this.startAutoBackup();
     }
 
-    async initializeDataDirectory() {
+    async getCurrentWords() {
         try {
-            const response = await fetch(`/${this.dataDir}`);
-            if (response.status === 404) {
-                console.log('Creating datasets directory...');
-                // Directory will be created on first backup
+            const response = await fetch(`${this.apiBaseUrl}/words`);
+            if (!response.ok) {
+                throw new Error(`API error: ${response.status}`);
             }
+            return await response.json();
         } catch (error) {
-            console.log('Will create datasets directory on first backup');
+            console.error('Error fetching words:', error);
+            // Fallback to localStorage if API fails
+            return JSON.parse(localStorage.getItem('roots-words') || '{}');
         }
     }
 
-    getCurrentWords() {
-        return JSON.parse(localStorage.getItem('roots-words')) || {};
+    async addWord(word) {
+        try {
+            const response = await fetch(`${this.apiBaseUrl}/words`, {
+                method: 'POST',
+                headers: {
+                    'Content-Type': 'application/json'
+                },
+                body: JSON.stringify({ word })
+            });
+
+            if (!response.ok) {
+                throw new Error(`API error: ${response.status}`);
+            }
+
+            return await response.json();
+        } catch (error) {
+            console.error('Error adding word:', error);
+            // Fallback to localStorage if API fails
+            const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
+            words[word] = (words[word] || 0) + 1;
+            localStorage.setItem('roots-words', JSON.stringify(words));
+            return { word, count: words[word] };
+        }
     }
 
     async saveDataset() {
-        const currentData = this.getCurrentWords();
-        const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
-        const filename = `roots_dataset_${timestamp}.json`;
-
-        const dataBlob = new Blob(
-            [JSON.stringify(currentData, null, 2)],
-            { type: 'application/json' }
-        );
-
-        // Create download link
-        const link = document.createElement('a');
-        link.href = URL.createObjectURL(dataBlob);
-        link.download = filename;
-
-        // Trigger download
-        document.body.appendChild(link);
-        link.click();
-        document.body.removeChild(link);
-
-        // Clean up
-        URL.revokeObjectURL(link.href);
-
-        console.log(`Dataset saved: $(unknown)`);
-        this.updateDatasetList(filename, currentData);
+        try {
+            const response = await fetch(`${this.apiBaseUrl}/datasets`, {
+                method: 'POST',
+                headers: {
+                    'Content-Type': 'application/json'
+                }
+            });
+
+            if (!response.ok) {
+                throw new Error(`API error: ${response.status}`);
+            }
+
+            const dataset = await response.json();
+            console.log(`Dataset saved: ${dataset.filename}`);
+            this.updateDatasetDisplay();
+            return dataset;
+        } catch (error) {
+            console.error('Error saving dataset:', error);
+            // Fallback to the old method if API fails
+            const currentData = await this.getCurrentWords();
+            const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
+            const filename = `roots_dataset_${timestamp}.json`;
+
+            const dataBlob = new Blob(
+                [JSON.stringify(currentData, null, 2)],
+                { type: 'application/json' }
+            );
+
+            // Create download link
+            const link = document.createElement('a');
+            link.href = URL.createObjectURL(dataBlob);
+            link.download = filename;
+
+            // Trigger download
+            document.body.appendChild(link);
+            link.click();
+            document.body.removeChild(link);
+
+            // Clean up
+            URL.revokeObjectURL(link.href);
+
+            console.log(`Dataset saved locally: $(unknown)`);
+            this.updateLocalDatasetList(filename, currentData);
+        }
     }
 
-    updateDatasetList(filename, data) {
+    updateLocalDatasetList(filename, data) {
         const datasets = JSON.parse(localStorage.getItem('roots-datasets') || '[]');
         datasets.push({
             filename,
@@ -68,25 +110,50 @@ class DataManager {
         this.updateDatasetDisplay();
     }
 
-    updateDatasetDisplay() {
+    async updateDatasetDisplay() {
         const datasetList = document.getElementById('dataset-list');
         if (!datasetList) return;
 
-        const datasets = JSON.parse(localStorage.getItem('roots-datasets') || '[]');
-        datasetList.innerHTML = datasets.reverse().slice(0, 5).map(dataset => `
-            <div class="dataset-item">
-                <div class="dataset-info">
-                    <span class="dataset-name">${dataset.filename}</span>
-                    <span class="dataset-stats">
-                        Words: ${dataset.wordCount} |
-                        Submissions: ${dataset.totalSubmissions}
-                    </span>
-                </div>
-                <div class="dataset-time">
-                    ${new Date(dataset.timestamp).toLocaleString()}
-                </div>
-            </div>
-        `).join('');
+        try {
+            const response = await fetch(`${this.apiBaseUrl}/datasets/recent/list`);
+            if (!response.ok) {
+                throw new Error(`API error: ${response.status}`);
+            }
+
+            const datasets = await response.json();
+            datasetList.innerHTML = datasets.slice(0, 5).map(dataset => `
+                <div class="dataset-item">
+                    <div class="dataset-info">
+                        <span class="dataset-name">${dataset.filename}</span>
+                        <span class="dataset-stats">
+                            Words: ${dataset.wordCount} |
+                            Submissions: ${dataset.totalSubmissions}
+                        </span>
+                    </div>
+                    <div class="dataset-time">
+                        ${new Date(dataset.timestamp).toLocaleString()}
+                    </div>
+                </div>
+            `).join('');
+        } catch (error) {
+            console.error('Error fetching datasets:', error);
+            // Fallback to localStorage
+            const datasets = JSON.parse(localStorage.getItem('roots-datasets') || '[]');
+            datasetList.innerHTML = datasets.reverse().slice(0, 5).map(dataset => `
+                <div class="dataset-item">
+                    <div class="dataset-info">
+                        <span class="dataset-name">${dataset.filename}</span>
+                        <span class="dataset-stats">
+                            Words: ${dataset.wordCount} |
+                            Submissions: ${dataset.totalSubmissions}
+                        </span>
+                    </div>
+                    <div class="dataset-time">
+                        ${new Date(dataset.timestamp).toLocaleString()}
+                    </div>
+                </div>
+            `).join('');
+        }
     }
 
     startAutoBackup() {
@@ -101,4 +168,5 @@ class DataManager {
 // Initialize data manager when document is ready
 document.addEventListener('DOMContentLoaded', () => {
     window.dataManager = new DataManager();
+    console.log('R00TS Data Manager initialized with production-ready backend');
 });
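The pattern repeated throughout the new `data_manager.js` — try the API first, fall back to localStorage on failure — can be factored into a single helper. This is a hypothetical sketch, not code from the commit: `withFallback`, `failingApi`, and the in-memory `store` are illustrative names standing in for the fetch call and the localStorage read.

```javascript
// Hypothetical sketch of the "API first, local fallback" pattern used by
// getCurrentWords/addWord/saveDataset. `primary` stands in for the fetch
// call, `fallback` for the localStorage path.
async function withFallback(primary, fallback) {
  try {
    return await primary();
  } catch (error) {
    console.error('Primary source failed, using fallback:', error.message);
    return fallback();
  }
}

// Simulate a failing API call and an in-memory fallback store.
const store = { hello: 2 };
async function failingApi() { throw new Error('API error: 503'); }

withFallback(failingApi, () => store).then(words => {
  console.log(words.hello); // 2, served from the fallback store
});
```

Centralizing the try/catch like this would avoid duplicating the fallback branch in every method, at the cost of making the per-method error messages more generic.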
182
datasets.html
@@ -194,14 +194,49 @@
     </div>
     </div>
 
+    <script src="data_manager.js"></script>
     <script>
-        document.addEventListener('DOMContentLoaded', function() {
-            const datasets = JSON.parse(localStorage.getItem('roots-datasets') || '[]');
-            const currentWords = JSON.parse(localStorage.getItem('roots-words') || '{}');
+        const apiBaseUrl = '/api';
+
+        async function fetchDatasets() {
+            try {
+                const response = await fetch(`${apiBaseUrl}/datasets`);
+                if (!response.ok) {
+                    throw new Error(`API error: ${response.status}`);
+                }
+                return await response.json();
+            } catch (error) {
+                console.error('Error fetching datasets:', error);
+                // Fallback to localStorage
+                return JSON.parse(localStorage.getItem('roots-datasets') || '[]');
+            }
+        }
+
+        async function fetchStats() {
+            try {
+                const response = await fetch(`${apiBaseUrl}/words/stats`);
+                if (!response.ok) {
+                    throw new Error(`API error: ${response.status}`);
+                }
+                return await response.json();
+            } catch (error) {
+                console.error('Error fetching stats:', error);
+                // Fallback to localStorage
+                const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
+                return {
+                    uniqueWords: Object.keys(words).length,
+                    totalSubmissions: Object.values(words).reduce((a, b) => a + b, 0)
+                };
+            }
+        }
+
+        document.addEventListener('DOMContentLoaded', async function() {
+            // Fetch datasets and stats
+            const datasets = await fetchDatasets();
+            const stats = await fetchStats();
 
             // Update overall stats
             const totalDatasets = datasets.length;
-            const latestDataset = datasets[datasets.length - 1] || { wordCount: 0, totalSubmissions: 0 };
 
             document.getElementById('datasetStats').innerHTML = `
                 <div>
@@ -209,18 +244,18 @@
                     <p class="stat-description">Total Snapshots</p>
                 </div>
                 <div>
-                    <p class="total-stat">${latestDataset.wordCount}</p>
+                    <p class="total-stat">${stats.uniqueWords}</p>
                     <p class="stat-description">Unique Words</p>
                 </div>
                 <div>
-                    <p class="total-stat">${latestDataset.totalSubmissions}</p>
+                    <p class="total-stat">${stats.totalSubmissions}</p>
                     <p class="stat-description">Total Submissions</p>
                 </div>
             `;
 
             // Display datasets
             const datasetList = document.getElementById('dataset-list');
-            datasetList.innerHTML = datasets.reverse().map(dataset => `
+            datasetList.innerHTML = datasets.map(dataset => `
                 <div class="dataset-item">
                     <div class="dataset-header">
                         <h3 class="dataset-name">${dataset.filename}</h3>
@@ -250,45 +285,104 @@
             `).join('');
         });
 
-        function downloadDataset(filename) {
-            const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
-            const dataBlob = new Blob([JSON.stringify(words, null, 2)], { type: 'application/json' });
-            const link = document.createElement('a');
-            link.href = URL.createObjectURL(dataBlob);
-            link.download = filename;
-            document.body.appendChild(link);
-            link.click();
-            document.body.removeChild(link);
-            URL.revokeObjectURL(link.href);
+        async function downloadDataset(filename) {
+            try {
+                const response = await fetch(`${apiBaseUrl}/datasets/$(unknown)`);
+                if (!response.ok) {
+                    throw new Error(`API error: ${response.status}`);
+                }
+
+                const dataset = await response.json();
+                const dataBlob = new Blob([JSON.stringify(dataset.data, null, 2)], { type: 'application/json' });
+                const link = document.createElement('a');
+                link.href = URL.createObjectURL(dataBlob);
+                link.download = filename;
+                document.body.appendChild(link);
+                link.click();
+                document.body.removeChild(link);
+                URL.revokeObjectURL(link.href);
+            } catch (error) {
+                console.error('Error downloading dataset:', error);
+                alert('Error downloading dataset. Please try again later.');
+
+                // Fallback to localStorage if API fails
+                const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
+                const dataBlob = new Blob([JSON.stringify(words, null, 2)], { type: 'application/json' });
+                const link = document.createElement('a');
+                link.href = URL.createObjectURL(dataBlob);
+                link.download = filename;
+                document.body.appendChild(link);
+                link.click();
+                document.body.removeChild(link);
+                URL.revokeObjectURL(link.href);
+            }
         }
 
-        function viewDataset(filename) {
-            const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
-            const formattedData = JSON.stringify(words, null, 2);
-            const win = window.open('', '_blank');
-            win.document.write(`
-                <html>
-                <head>
-                    <title>$(unknown)</title>
-                    <style>
-                        body {
-                            background: #1e1e1e;
-                            color: #00ff9d;
-                            font-family: monospace;
-                            padding: 20px;
-                            margin: 0;
-                        }
-                        pre {
-                            white-space: pre-wrap;
-                            word-wrap: break-word;
-                        }
-                    </style>
-                </head>
-                <body>
-                    <pre>${formattedData}</pre>
-                </body>
-                </html>
-            `);
+        async function viewDataset(filename) {
+            try {
+                const response = await fetch(`${apiBaseUrl}/datasets/$(unknown)`);
+                if (!response.ok) {
+                    throw new Error(`API error: ${response.status}`);
+                }
+
+                const dataset = await response.json();
+                const formattedData = JSON.stringify(dataset.data, null, 2);
+                const win = window.open('', '_blank');
+                win.document.write(`
+                    <html>
+                    <head>
+                        <title>$(unknown)</title>
+                        <style>
+                            body {
+                                background: #1e1e1e;
+                                color: #00ff9d;
+                                font-family: monospace;
+                                padding: 20px;
+                                margin: 0;
+                            }
+                            pre {
+                                white-space: pre-wrap;
+                                word-wrap: break-word;
+                            }
+                        </style>
+                    </head>
+                    <body>
+                        <pre>${formattedData}</pre>
+                    </body>
+                    </html>
+                `);
+            } catch (error) {
+                console.error('Error viewing dataset:', error);
+                alert('Error viewing dataset. Please try again later.');
+
+                // Fallback to localStorage if API fails
+                const words = JSON.parse(localStorage.getItem('roots-words') || '{}');
+                const formattedData = JSON.stringify(words, null, 2);
+                const win = window.open('', '_blank');
+                win.document.write(`
+                    <html>
+                    <head>
+                        <title>$(unknown)</title>
+                        <style>
+                            body {
+                                background: #1e1e1e;
+                                color: #00ff9d;
+                                font-family: monospace;
+                                padding: 20px;
+                                margin: 0;
+                            }
+                            pre {
+                                white-space: pre-wrap;
+                                word-wrap: break-word;
+                            }
+                        </style>
+                    </head>
+                    <body>
+                        <pre>${formattedData}</pre>
+                    </body>
+                    </html>
+                `);
+            }
         }
     </script>
 </body>
87
deploy.sh
Executable file
@@ -0,0 +1,87 @@
#!/bin/bash

# R00TS Autonomous Deployment Script
# This script automates the deployment of the R00TS application

echo "========================================"
echo "R00TS Autonomous Deployment"
echo "========================================"

# Check if MongoDB is installed
if ! command -v mongod &> /dev/null; then
    echo "MongoDB is not installed. Installing MongoDB..."

    # Detect OS and install MongoDB accordingly
    if [[ "$(uname)" == "Darwin" ]]; then
        # macOS
        if command -v brew &> /dev/null; then
            brew tap mongodb/brew
            brew install mongodb-community
            brew services start mongodb-community
        else
            echo "Homebrew is required to install MongoDB on macOS."
            echo "Please install Homebrew first: https://brew.sh/"
            exit 1
        fi
    elif [[ "$(uname)" == "Linux" ]]; then
        # Linux (Ubuntu/Debian assumed)
        sudo apt-get update
        sudo apt-get install -y mongodb
        sudo systemctl start mongodb
    else
        echo "Unsupported operating system. Please install MongoDB manually."
        echo "Visit: https://www.mongodb.com/docs/manual/installation/"
        exit 1
    fi

    echo "MongoDB installed successfully!"
fi

# Navigate to the project directory
cd "$(dirname "$0")"

# Install PM2 globally if not installed
if ! command -v pm2 &> /dev/null; then
    echo "Installing PM2 process manager..."
    npm install -g pm2
fi

# Navigate to server directory and install dependencies
echo "Setting up server..."
cd server

# Install server dependencies
npm install

# Check if .env file exists, create if not
if [ ! -f ".env" ]; then
    echo "Creating .env file..."
    cp .env.example .env
    echo "Please update the .env file with your MongoDB connection string if needed."
fi

# Start the server with PM2
echo "Starting R00TS server with PM2..."
npm run prod

# Setup PM2 to start on system boot
pm2 startup
echo "Run the above command if you want PM2 to start on system boot"

# Setup PM2 to save current process list
pm2 save

# Display status
echo -e "\nR00TS server is now running!"
echo "========================================"
echo "Server Status:"
pm2 status r00ts-server
echo "========================================"
echo "Health Check:"
curl -s http://localhost:5000/api/health | json_pp || echo "Health check endpoint not accessible"
echo -e "\n========================================"
echo "To view server logs: npm run logs"
echo "To restart server: npm run restart"
echo "To stop server: npm run stop"
echo -e "\nOpen index.html in your browser to use the application"
echo "========================================"
59
index.html
@@ -177,6 +177,44 @@
     .r00ts-brand:hover::after {
         transform: scaleX(1);
         transform-origin: left;
+    .header {
+        text-align: center;
+        margin-bottom: 30px;
+    }
+    .header h1 {
+        font-weight: 700;
+        color: #2e2e2e;
+        font-size: 3rem;
+    }
+    .header .lead {
+        font-size: 1.2rem;
+        color: #555;
+    }
+    .input-area {
+        background-color: #ffffff;
+        padding: 20px;
+        border-radius: 10px;
+        box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
+        margin-bottom: 30px;
+    }
+    .btn-primary {
+        background-color: #4a6fa5;
+        border: none;
+        padding: 10px 20px;
+    }
+    .btn-primary:hover {
+        background-color: #3a5a8c;
+    }
+    .footer {
+        text-align: center;
+        margin-top: 30px;
+        color: #6c757d;
+        font-size: 0.9rem;
+    }
+    .r00ts-brand {
+        font-family: monospace;
+        font-weight: bold;
+        letter-spacing: 1px;
+    }
     }
     .stats-area {
         display: flex;
@@ -274,11 +312,26 @@
         opacity: 0.5;
         font-size: 0.8rem;
         font-family: 'Share Tech Mono', monospace;
+=======
+        margin: 20px 0;
+    }
+    .stat-box {
+        text-align: center;
+        padding: 10px;
+        background-color: #ffffff;
+        border-radius: 5px;
+        box-shadow: 0 2px 4px rgba(0, 0, 0, 0.05);
+        flex: 1;
+        margin: 0 10px;
+>>>>>>> 6fcedc69cfea5193a6809e1f2fb705b42479c5bd
     }
     </style>
 </head>
 <body>
+<<<<<<< HEAD
     <div id="particles-js"></div>
+=======
+>>>>>>> 6fcedc69cfea5193a6809e1f2fb705b42479c5bd
     <div class="container">
         <div class="header">
             <h1 class="r00ts-brand">R00TS</h1>
@@ -288,7 +341,11 @@
         <div class="input-area">
             <form id="word-form">
                 <div class="mb-3">
+<<<<<<< HEAD
                     <label for="word-input" class="form-label">Enter a word to give it power:</label>
+=======
+                    <label for="word-input" class="form-label">Enter a word you want future AI to understand:</label>
+>>>>>>> 6fcedc69cfea5193a6809e1f2fb705b42479c5bd
                     <input type="text" class="form-control" id="word-input" placeholder="Type a word..." required>
                 </div>
                 <button type="submit" class="btn btn-primary">Plant Word</button>
@@ -318,6 +375,8 @@
             </div>
         </div>
 
+=======
+>>>>>>> 6fcedc69cfea5193a6809e1f2fb705b42479c5bd
         <div class="footer">
             <p>R00TS - Nurturing the future of artificial intelligence, one word at a time.</p>
         </div>
83
script.js
@@ -87,16 +87,23 @@ function initParticles() {
     });
 }
 
-function loadWords() {
-    // In a real implementation, this would be an API call
-    // For demo purposes, we're using localStorage
-    let words = JSON.parse(localStorage.getItem('roots-words')) || {};
+async function loadWords() {
+    try {
+        // Use the data manager to get words from API
+        let words = await window.dataManager.getCurrentWords();
 
     // Update the visualization
     updateWordCloud(words);
 
     // Update statistics
     updateStats(words);
+    } catch (error) {
+        console.error('Error loading words:', error);
+        // Fallback to localStorage if API fails
+        let words = JSON.parse(localStorage.getItem('roots-words')) || {};
+        updateWordCloud(words);
+        updateStats(words);
+    }
 }
 
 function updateStats(words) {
@@ -107,30 +114,43 @@ function updateStats(words) {
     document.getElementById('unique-count').textContent = uniqueWords;
 }
 
-function submitWord(word) {
+async function submitWord(word) {
     word = word.trim().toLowerCase();
 
     if (!word) return false;
 
     // Create a particle burst effect
-    createParticleBurst();
+    if (typeof createParticleBurst === 'function') {
+        createParticleBurst();
+    }
 
-    // For demo purposes, we're using localStorage
-    let words = JSON.parse(localStorage.getItem('roots-words')) || {};
-    words[word] = (words[word] || 0) + 1;
-    localStorage.setItem('roots-words', JSON.stringify(words));
+    try {
+        // Use the data manager to add word via API
+        await window.dataManager.addWord(word);
 
-    // Update UI with animation
+        // Update UI with animation if GSAP is available
+        if (typeof gsap !== 'undefined') {
     gsap.to('.stat-box', {
         scale: 1.1,
         duration: 0.2,
         yoyo: true,
         repeat: 1,
         ease: 'power2.out'
     });
+        }
 
-    loadWords();
-    return true;
+        loadWords();
+        return true;
+    } catch (error) {
+        console.error('Error submitting word:', error);
+        // Fallback to localStorage if API fails
+        let words = JSON.parse(localStorage.getItem('roots-words')) || {};
+        words[word] = (words[word] || 0) + 1;
+        localStorage.setItem('roots-words', JSON.stringify(words));
+
+        loadWords();
+        return true;
+    }
 }
 
 function createParticleBurst() {
@@ -175,6 +195,7 @@ function createParticleBurst() {
         }
     );
     }
+
 }
 
 function updateWordCloud(words) {
@@ -195,6 +216,10 @@ function updateWordCloud(words) {
     const topWords = wordData.slice(0, 100);
 
     if (topWords.length === 0) {
+<<<<<<< HEAD
+=======
+        // Show placeholder if no words
+>>>>>>> 6fcedc69cfea5193a6809e1f2fb705b42479c5bd
         container.innerHTML = '<div class="d-flex justify-content-center align-items-center h-100"><p class="text-muted">Plant some words to see them grow here!</p></div>';
         return;
     }
@@ -209,6 +234,7 @@ function updateWordCloud(words) {
         .attr('width', width)
         .attr('height', height)
         .append('g')
+<<<<<<< HEAD
         .attr('transform', `translate(${width/2}, ${height * 0.8})`);
 
     // Create tree trunk
@@ -305,7 +331,6 @@ function updateWordCloud(words) {
         .attr('alignment-baseline', 'middle');
     });
 }
-}
 
 // Function to share words
 function shareResults() {
2
server/.env
Normal file
@@ -0,0 +1,2 @@
PORT=5000
MONGODB_URI=mongodb://localhost:27017/r00ts
4
server/.env.example
Normal file
@@ -0,0 +1,4 @@
PORT=5000
MONGODB_URI=mongodb://localhost:27017/r00ts
# For production, use MongoDB Atlas or other cloud database
# MONGODB_URI=mongodb+srv://<username>:<password>@cluster.mongodb.net/r00ts
82
server/README.md
Normal file
@@ -0,0 +1,82 @@
# R00TS Backend Server

This is the backend server for the R00TS application, providing API endpoints for word and dataset management.

## Features

- RESTful API for word submissions and retrieval
- MongoDB integration for persistent data storage
- Automatic dataset creation and backup
- Production-ready configuration

## Prerequisites

- Node.js (v14 or higher)
- MongoDB (local installation or MongoDB Atlas account)

## Installation

1. Clone the repository (if you haven't already)
2. Navigate to the server directory:
   ```
   cd server
   ```
3. Install dependencies:
   ```
   npm install
   ```
4. Create a `.env` file based on the `.env.example` template:
   ```
   cp .env.example .env
   ```
5. Update the `.env` file with your MongoDB connection string

## Running the Server

### Development Mode

```
npm run dev
```

This will start the server with nodemon, which automatically restarts when changes are detected.

### Production Mode

```
npm start
```

## API Endpoints

### Words

- `GET /api/words` - Get all words
- `POST /api/words` - Add or update a word
- `GET /api/words/stats` - Get word statistics

### Datasets

- `GET /api/datasets` - Get all datasets (limited info)
- `GET /api/datasets/:filename` - Get a specific dataset by filename
- `POST /api/datasets` - Create a new dataset snapshot
- `GET /api/datasets/recent/list` - Get recent datasets (limited to 5)
|
|
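The endpoints above can be exercised from any HTTP client. A minimal Node sketch follows; `BASE_URL`, the `wordEndpoint` helper, and the `R00TS_URL` variable are illustrative assumptions, not part of the server code, and the routes are taken from the list above:

```javascript
// Minimal client sketch for the word endpoints listed above.
// BASE_URL and the helper names are illustrative, not part of the server.
const BASE_URL = process.env.R00TS_URL || 'http://localhost:5000';

// Map a logical action to the documented route
function wordEndpoint(action) {
  const routes = {
    list:   { method: 'GET',  path: '/api/words' },
    submit: { method: 'POST', path: '/api/words' },
    stats:  { method: 'GET',  path: '/api/words/stats' },
  };
  const { method, path } = routes[action];
  return { method, url: `${BASE_URL}${path}` };
}

// Submit a word; the server trims/lowercases it and upserts its count
async function submitWord(word) {
  const { method, url } = wordEndpoint('submit');
  const res = await fetch(url, {
    method,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ word }),
  });
  return res.json();
}
```

Note that `fetch` is global in Node 18+; on older Node versions, substitute an HTTP client of your choice.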
## Deployment

For production deployment, we recommend:

1. Set up a MongoDB Atlas cluster for your database
2. Update the `.env` file with your production MongoDB URI
3. Deploy to a hosting service such as Heroku, Vercel, or DigitalOcean
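A production `.env` for the server might look like the following. All values are placeholders (substitute your own Atlas credentials); `MONGODB_URI`, `PORT`, and `NODE_ENV` are the variables the server actually reads:

```
# server/.env (placeholder values)
MONGODB_URI=mongodb+srv://<user>:<password>@<cluster-host>/r00ts
PORT=5000
NODE_ENV=production
```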
## Data Migration

If you have existing data in localStorage that you want to migrate to the database:

1. Export your localStorage data
2. Use the import functionality (coming soon) to upload to the server
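Until the import functionality lands, the export step can be sketched as below. This is a browser-side sketch; the storage key `r00ts_words` is an assumption, so use whatever key the frontend actually writes:

```javascript
// Sketch of exporting word counts from localStorage as a JSON payload.
// The key 'r00ts_words' is a hypothetical example; check the frontend code.
function exportWords(storage) {
  // `storage` is any localStorage-like object (pass window.localStorage in a browser)
  const raw = storage.getItem('r00ts_words');
  return raw ? JSON.parse(raw) : {};
}
```

The resulting object has the same `{ word: count }` shape the server serves from `GET /api/words`, so each entry could later be uploaded once the import endpoint exists.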
## License

See the main project license file.
27  server/models/Dataset.js  Normal file
@@ -0,0 +1,27 @@
const mongoose = require('mongoose');

const DatasetSchema = new mongoose.Schema({
  filename: {
    type: String,
    required: true,
    unique: true
  },
  timestamp: {
    type: Date,
    default: Date.now
  },
  wordCount: {
    type: Number,
    required: true
  },
  totalSubmissions: {
    type: Number,
    required: true
  },
  data: {
    type: Object,
    required: true
  }
});

module.exports = mongoose.model('Dataset', DatasetSchema);
26  server/models/Word.js  Normal file
@@ -0,0 +1,26 @@
const mongoose = require('mongoose');

const WordSchema = new mongoose.Schema({
  word: {
    type: String,
    required: true,
    trim: true,
    lowercase: true,
    unique: true
  },
  count: {
    type: Number,
    required: true,
    default: 1
  },
  createdAt: {
    type: Date,
    default: Date.now
  },
  updatedAt: {
    type: Date,
    default: Date.now
  }
});

module.exports = mongoose.model('Word', WordSchema);
29  server/package.json  Normal file
@@ -0,0 +1,29 @@
{
  "name": "r00ts-server",
  "version": "1.0.0",
  "description": "Backend server for R00TS application",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js",
    "prod": "pm2 start server.js --name r00ts-server",
    "stop": "pm2 stop r00ts-server",
    "restart": "pm2 restart r00ts-server",
    "status": "pm2 status r00ts-server",
    "logs": "pm2 logs r00ts-server",
    "backup": "node scripts/backup.js"
  },
  "dependencies": {
    "cors": "^2.8.5",
    "cron": "^2.4.0",
    "dotenv": "^16.3.1",
    "express": "^4.18.2",
    "mongodb-backup": "^1.6.9",
    "mongoose": "^7.5.0",
    "morgan": "^1.10.0",
    "pm2": "^5.3.0"
  },
  "devDependencies": {
    "nodemon": "^3.0.1"
  }
}
86  server/routes/datasets.js  Normal file
@@ -0,0 +1,86 @@
const express = require('express');
const router = express.Router();
const Dataset = require('../models/Dataset');
const Word = require('../models/Word');

// Get all datasets (limited info)
router.get('/', async (req, res) => {
  try {
    const datasets = await Dataset.find().select('-data').sort({ timestamp: -1 });
    res.json(datasets);
  } catch (err) {
    console.error('Error fetching datasets:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

// Get a specific dataset by filename
router.get('/:filename', async (req, res) => {
  try {
    const dataset = await Dataset.findOne({ filename: req.params.filename });

    if (!dataset) {
      return res.status(404).json({ message: 'Dataset not found' });
    }

    res.json(dataset);
  } catch (err) {
    console.error('Error fetching dataset:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

// Create a new dataset snapshot
router.post('/', async (req, res) => {
  try {
    // Get all words from the database
    const words = await Word.find();

    // Format data to match the existing structure
    const formattedWords = {};
    words.forEach(word => {
      formattedWords[word.word] = word.count;
    });

    // Calculate stats
    const wordCount = words.length;
    const totalSubmissions = words.reduce((sum, word) => sum + word.count, 0);

    // Create filename with timestamp
    const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
    const filename = `roots_dataset_${timestamp}.json`;

    // Create new dataset
    const newDataset = new Dataset({
      filename,
      timestamp: new Date(),
      wordCount,
      totalSubmissions,
      data: formattedWords
    });

    await newDataset.save();

    res.status(201).json(newDataset);
  } catch (err) {
    console.error('Error creating dataset:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

// Get recent datasets (limited to 5)
router.get('/recent/list', async (req, res) => {
  try {
    const datasets = await Dataset.find()
      .select('-data')
      .sort({ timestamp: -1 })
      .limit(5);

    res.json(datasets);
  } catch (err) {
    console.error('Error fetching recent datasets:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

module.exports = router;
66  server/routes/words.js  Normal file
@@ -0,0 +1,66 @@
const express = require('express');
const router = express.Router();
const Word = require('../models/Word');

// Get all words
router.get('/', async (req, res) => {
  try {
    const words = await Word.find();

    // Format data to match the existing structure
    const formattedWords = {};
    words.forEach(word => {
      formattedWords[word.word] = word.count;
    });

    res.json(formattedWords);
  } catch (err) {
    console.error('Error fetching words:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

// Add or update a word
router.post('/', async (req, res) => {
  try {
    const { word } = req.body;

    if (!word || typeof word !== 'string') {
      return res.status(400).json({ message: 'Word is required and must be a string' });
    }

    const normalizedWord = word.trim().toLowerCase();

    // Find and update if exists, or create new
    const updatedWord = await Word.findOneAndUpdate(
      { word: normalizedWord },
      { $inc: { count: 1 }, updatedAt: Date.now() },
      { new: true, upsert: true }
    );

    res.json(updatedWord);
  } catch (err) {
    console.error('Error adding word:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

// Get statistics
router.get('/stats', async (req, res) => {
  try {
    const totalWords = await Word.countDocuments();
    const totalSubmissions = await Word.aggregate([
      { $group: { _id: null, total: { $sum: '$count' } } }
    ]);

    res.json({
      uniqueWords: totalWords,
      totalSubmissions: totalSubmissions.length > 0 ? totalSubmissions[0].total : 0
    });
  } catch (err) {
    console.error('Error fetching stats:', err);
    res.status(500).json({ message: 'Server error' });
  }
});

module.exports = router;
85  server/scripts/backup.js  Normal file
@@ -0,0 +1,85 @@
/**
 * R00TS Automated Database Backup Script
 * This script creates backups of the MongoDB database and manages backup rotation
 */

require('dotenv').config({ path: '../.env' });
const backup = require('mongodb-backup');
const fs = require('fs');
const path = require('path');
const { CronJob } = require('cron');

// Create backups directory if it doesn't exist
const backupDir = path.join(__dirname, '../backups');
if (!fs.existsSync(backupDir)) {
  fs.mkdirSync(backupDir, { recursive: true });
  console.log(`Created backups directory at ${backupDir}`);
}

/**
 * Perform MongoDB backup
 */
function performBackup() {
  const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
  const backupPath = path.join(backupDir, `backup-${timestamp}`);

  console.log(`Starting backup at ${new Date().toLocaleString()}...`);

  backup({
    uri: process.env.MONGODB_URI,
    root: backupPath,
    callback: function(err) {
      if (err) {
        console.error('Backup failed:', err);
      } else {
        console.log(`Backup completed successfully at ${backupPath}`);
        // Rotate backups (keep only the last 7 backups)
        rotateBackups();
      }
    }
  });
}

/**
 * Rotate backups to keep only the most recent ones
 */
function rotateBackups() {
  fs.readdir(backupDir, (err, files) => {
    if (err) {
      console.error('Error reading backup directory:', err);
      return;
    }

    // Sort files by creation time (oldest first)
    const sortedFiles = files.map(file => ({
      name: file,
      path: path.join(backupDir, file),
      time: fs.statSync(path.join(backupDir, file)).birthtime
    })).sort((a, b) => a.time - b.time);

    // Keep only the 7 most recent backups
    const MAX_BACKUPS = 7;
    if (sortedFiles.length > MAX_BACKUPS) {
      const filesToDelete = sortedFiles.slice(0, sortedFiles.length - MAX_BACKUPS);
      filesToDelete.forEach(file => {
        fs.rm(file.path, { recursive: true, force: true }, (err) => {
          if (err) {
            console.error(`Error deleting old backup ${file.name}:`, err);
          } else {
            console.log(`Deleted old backup: ${file.name}`);
          }
        });
      });
    }
  });
}

// If this script is run directly, perform a backup immediately
if (require.main === module) {
  performBackup();
}

// Schedule automatic backups (daily at 3:00 AM)
const backupJob = new CronJob('0 3 * * *', performBackup, null, true);

module.exports = { performBackup, backupJob };
176  server/server.js  Normal file
@@ -0,0 +1,176 @@
require('dotenv').config();
const express = require('express');
const mongoose = require('mongoose');
const cors = require('cors');
const morgan = require('morgan');
const path = require('path');
const fs = require('fs');
const { CronJob } = require('cron');

// Create logs directory if it doesn't exist
const logsDir = path.join(__dirname, 'logs');
if (!fs.existsSync(logsDir)) {
  fs.mkdirSync(logsDir, { recursive: true });
  console.log(`Created logs directory at ${logsDir}`);
}

// Configure logging
const logStream = fs.createWriteStream(
  path.join(logsDir, `server-${new Date().toISOString().split('T')[0]}.log`),
  { flags: 'a' }
);

// Redirect console output to log file and console
const originalConsoleLog = console.log;
const originalConsoleError = console.error;

console.log = function() {
  const args = Array.from(arguments);
  const timestamp = new Date().toISOString();
  const logMessage = `[${timestamp}] INFO: ${args.join(' ')}\n`;
  logStream.write(logMessage);
  originalConsoleLog.apply(console, arguments);
};

console.error = function() {
  const args = Array.from(arguments);
  const timestamp = new Date().toISOString();
  const logMessage = `[${timestamp}] ERROR: ${args.join(' ')}\n`;
  logStream.write(logMessage);
  originalConsoleError.apply(console, arguments);
};

// Import routes
const wordRoutes = require('./routes/words');
const datasetRoutes = require('./routes/datasets');

const app = express();
const PORT = process.env.PORT || 5000;

// Middleware
app.use(cors());
app.use(express.json());
app.use(morgan('dev'));

// Serve static files from the React app
app.use(express.static(path.join(__dirname, '..')));

// Connect to MongoDB with retry logic
const connectWithRetry = () => {
  console.log('Attempting to connect to MongoDB...');
  // The legacy useNewUrlParser/useUnifiedTopology options are deprecated
  // no-ops in Mongoose 7, so only the URI is needed here.
  mongoose.connect(process.env.MONGODB_URI)
    .then(() => {
      console.log('MongoDB connected successfully');
      // Initialize scheduled tasks after successful connection
      initializeScheduledTasks();
    })
    .catch(err => {
      console.error('MongoDB connection error:', err);
      console.log('Retrying connection in 5 seconds...');
      setTimeout(connectWithRetry, 5000);
    });
};

// Handle MongoDB disconnection (auto-reconnect)
mongoose.connection.on('disconnected', () => {
  console.log('MongoDB disconnected! Attempting to reconnect...');
  connectWithRetry();
});

// Initial connection
connectWithRetry();

// Initialize scheduled tasks
function initializeScheduledTasks() {
  // Import backup script
  const { performBackup } = require('./scripts/backup');

  // Schedule daily backup at 3:00 AM
  const backupJob = new CronJob('0 3 * * *', performBackup, null, true);
  console.log('Scheduled automatic database backup job');

  // Schedule weekly database maintenance at 2:00 AM on Sundays
  const maintenanceJob = new CronJob('0 2 * * 0', async () => {
    console.log('Running weekly database maintenance...');
    try {
      // Perform any maintenance tasks here
      // For example, compact collections, validate data integrity, etc.
      console.log('Database maintenance completed successfully');
    } catch (error) {
      console.error('Database maintenance error:', error);
    }
  }, null, true);
  console.log('Scheduled weekly database maintenance job');
}

// Health check endpoint
app.get('/api/health', (req, res) => {
  const dbStatus = mongoose.connection.readyState === 1 ? 'connected' : 'disconnected';

  res.json({
    status: 'ok',
    timestamp: new Date().toISOString(),
    server: {
      uptime: process.uptime(),
      memory: process.memoryUsage(),
    },
    database: {
      status: dbStatus
    }
  });
});

// API Routes
app.use('/api/words', wordRoutes);
app.use('/api/datasets', datasetRoutes);

// Serve the main HTML file for any other request
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, '..', 'index.html'));
});

// Error handling middleware
app.use((err, req, res, next) => {
  console.error('Unhandled error:', err);
  res.status(500).json({ error: 'Internal server error', message: err.message });
});

// Handle uncaught exceptions
process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err);
  // Keep the process alive but log the error
});

// Handle unhandled promise rejections
process.on('unhandledRejection', (reason, promise) => {
  console.error('Unhandled promise rejection:', reason);
  // Keep the process alive but log the error
});

// Start the server
const server = app.listen(PORT, () => {
  console.log(`R00TS server running on port ${PORT}`);
  console.log(`Server time: ${new Date().toLocaleString()}`);
  console.log(`Environment: ${process.env.NODE_ENV || 'development'}`);
  console.log(`MongoDB URI: ${process.env.MONGODB_URI.replace(/\/\/.*@/, '//***:***@')}`);
  console.log('Server is ready to accept connections');
});

// Graceful shutdown
process.on('SIGTERM', () => {
  console.log('SIGTERM received, shutting down gracefully');
  server.close(() => {
    console.log('Server closed');
    // Mongoose 7 removed callback support, so close() returns a promise
    mongoose.connection.close(false).then(() => {
      console.log('MongoDB connection closed');
      process.exit(0);
    });
  });
});

module.exports = app; // Export for testing
31  setup.sh  Normal file
@@ -0,0 +1,31 @@
#!/bin/bash

# R00TS Setup Script
echo "Setting up R00TS production environment..."

# Check if MongoDB is installed
if ! command -v mongod &> /dev/null; then
    echo "MongoDB is not installed. Please install MongoDB first."
    echo "Visit: https://www.mongodb.com/docs/manual/installation/"
    exit 1
fi

# Navigate to server directory
cd "$(dirname "$0")/server"

# Install dependencies
echo "Installing server dependencies..."
npm install

# Check if .env file exists, create if not
if [ ! -f ".env" ]; then
    echo "Creating .env file..."
    cp .env.example .env
    echo "Please update the .env file with your MongoDB connection string if needed."
fi

# Start the server
echo "Starting R00TS server..."
echo "The server will be available at http://localhost:5000"
echo "Open index.html in your browser to use the application"
npm start
75  start_r00ts.sh  Executable file
@@ -0,0 +1,75 @@
#!/bin/bash

# R00TS Easy Startup Script
echo "========================================"
echo "Starting R00TS Application"
echo "========================================"

cd "$(dirname "$0")"

# Check if server is already running
if pgrep -f "node.*server.js" > /dev/null; then
    echo "R00TS server is already running!"
else
    # Check if MongoDB is running
    if ! pgrep -x mongod > /dev/null; then
        echo "Starting MongoDB..."
        if [[ "$(uname)" == "Darwin" ]]; then
            # macOS
            brew services start mongodb-community || mongod --config /usr/local/etc/mongod.conf --fork
        elif [[ "$(uname)" == "Linux" ]]; then
            # Linux
            sudo systemctl start mongodb || sudo service mongodb start
        fi
    fi

    # Start the server
    echo "Starting R00TS server..."
    cd server
    npm start &
    cd ..
    echo "Server starting in background..."
fi

# Wait for server to start
echo "Waiting for server to start..."
sleep 3

# Check if server is running
if curl -s http://localhost:5000/api/health > /dev/null; then
    echo "Server is running!"

    # Open the application in the default browser
    echo "Opening R00TS in your browser..."
    if [[ "$(uname)" == "Darwin" ]]; then
        # macOS
        open "http://localhost:5000"
    elif [[ "$(uname)" == "Linux" ]]; then
        # Linux
        xdg-open "http://localhost:5000" || firefox "http://localhost:5000" || google-chrome "http://localhost:5000"
    elif [[ "$(uname)" == "MINGW"* ]]; then
        # Windows
        start "http://localhost:5000"
    fi

    # Open dashboard in browser
    echo "Opening admin dashboard..."
    if [[ "$(uname)" == "Darwin" ]]; then
        # macOS
        open "dashboard.html"
    elif [[ "$(uname)" == "Linux" ]]; then
        # Linux
        xdg-open "dashboard.html" || firefox "dashboard.html" || google-chrome "dashboard.html"
    elif [[ "$(uname)" == "MINGW"* ]]; then
        # Windows
        start "dashboard.html"
    fi
else
    echo "Server failed to start. Please check the logs in server/logs directory."
fi

echo "========================================"
echo "R00TS is now ready!"
echo "To stop the server, run: cd server && npm stop"
echo "========================================"