A small rule-based expert system that evaluates a user's digital privacy and security posture and produces prioritized recommendations. The project is implemented in Python with a modern Streamlit web GUI, a simple testable inference engine, and optional CLIPS rule files for the knowledge base.
```
pip install -r requirements.txt
streamlit run app.py
```

Then open your browser to http://localhost:8501 and start the privacy assessment!
An expert system emulates decision-making of a domain expert using a set of explicitly encoded rules and facts. Core properties:
- Knowledge base: domain knowledge encoded separately from code (rules, templates). In this app: `clips/knowledge_base.clp` and `clips/templates.clp`.
- Working memory (facts): the current situation described as facts. In this app: a `user-profile` fact from `clips/sample_facts.clp` (or user inputs collected by `InputHandler`).
- Inference engine: applies rules to facts to derive new facts/recommendations. In this app: the CLIPS runtime (optional) and a pure-Python `InferenceEngine` in `src/inference_engine.py`.
- Reasoning strategy: we use forward chaining (data-driven). When facts match rule conditions, actions assert `recommendation` facts.
- Conflict resolution: when multiple rules are eligible, priority resolves the firing order. In CLIPS this can be managed with salience; in Python we encode a `priority` and `risk-score` to sort outputs.
- Explanation facility: the system can explain outcomes in human terms. In this app, each recommendation carries `message`, `details`, and `action`. You can extend this to include rule IDs for a why-trace.
- Uncertainty handling: many expert systems attach confidence/weights. Here we approximate with an additive `risk-score` per recommendation and compute an overall risk level.
- Separation of knowledge and control: rules (knowledge) live in CLIPS files; control/UI lives in Python, so knowledge can be edited without changing code.
- Modularity and maintainability: rules are small and focused; templates define a clear schema for facts and outputs.
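The properties above can be made concrete with a minimal, self-contained sketch of forward chaining with priority-based conflict resolution and an additive risk score. The field names (`priority`, `risk-score`) mirror this app's conventions, but the rule structure and `run_rules` helper are illustrative, not the actual `InferenceEngine` API.

```python
# Minimal forward-chaining sketch: each rule pairs a condition with a recommendation.
# Field names mirror the app's conventions; the helper structure is illustrative.
PRIORITY_ORDER = {"high": 0, "medium": 1, "low": 2}

RULES = [
    {
        "condition": lambda facts: facts.get("vpn") == "no",
        "recommendation": {
            "priority": "medium",
            "category": "Network Security",
            "message": "Consider using a VPN for all internet activity",
            "risk-score": 12,
        },
    },
    {
        "condition": lambda facts: facts.get("2fa") == "no",
        "recommendation": {
            "priority": "high",
            "category": "Account Security",
            "message": "Enable two-factor authentication",
            "risk-score": 25,
        },
    },
]

def run_rules(facts):
    """Fire every rule whose condition matches, then resolve conflicts by priority."""
    fired = [r["recommendation"] for r in RULES if r["condition"](facts)]
    # Conflict resolution: sort by priority rank, then by descending risk score.
    fired.sort(key=lambda rec: (PRIORITY_ORDER[rec["priority"]], -rec["risk-score"]))
    total_risk = sum(rec["risk-score"] for rec in fired)
    return fired, total_risk

recs, total = run_rules({"vpn": "no", "2fa": "no"})
print([r["message"] for r in recs], total)
```

With both facts set to `no`, both rules fire; the high-priority 2FA recommendation sorts first and the risk scores sum to 37.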
- Knowledge base: `clips/knowledge_base.clp`, `clips/templates.clp`
- Facts / working memory: `clips/sample_facts.clp`; at runtime, facts can be asserted from user input
- Inference engine: `src/inference_engine.py` (Python); optional CLIPS runtime via `clipspy`
- Recommendations / explanation: `deftemplate recommendation` with `message`, `details`, `action`; `src/output_handler.py` for sorting/summary
- Control and UI: `src/app_controller.py`, `gui/`
Forward chaining starts from known facts and repeatedly fires rules whose conditions match those facts, asserting new facts (recommendations) along the way. Example excerpt from `clips/knowledge_base.clp`:

```clips
(defrule no-vpn-rule
  (user-profile (vpn no))
  =>
  (assert (recommendation
    (priority medium)
    (category "Network Security")
    (message "Consider using a VPN for all internet activity")
    (details "VPNs protect your privacy by hiding your IP address and encrypting traffic.")
    (action "Research and subscribe to a reputable VPN service")
    (risk-score 12))))
```

When the working memory contains `(user-profile (vpn no))`, this rule fires and asserts a `recommendation` fact capturing the explanation and action.
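The same rule can be expressed in the pure-Python engine style. This is a hedged sketch: the real `InferenceEngine` in `src/inference_engine.py` may structure its rules differently, but the slot names match the CLIPS template above.

```python
def no_vpn_rule(profile):
    """Python translation of the CLIPS no-vpn-rule: fires when vpn == 'no'."""
    if profile.get("vpn") == "no":
        return {
            "priority": "medium",
            "category": "Network Security",
            "message": "Consider using a VPN for all internet activity",
            "details": "VPNs protect your privacy by hiding your IP address and encrypting traffic.",
            "action": "Research and subscribe to a reputable VPN service",
            "risk-score": 12,
        }
    return None  # rule does not fire

rec = no_vpn_rule({"vpn": "no"})
print(rec["message"])
```

Translating rules one-to-one like this keeps the Python engine testable while the CLIPS files remain the authoritative knowledge base.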
- Facts describing a user's profile are represented as CLIPS facts (see `clips/sample_facts.clp`).
- Domain rules and recommendations are authored in CLIPS (`clips/knowledge_base.clp`) using templates defined in `clips/templates.clp`.
- A pure-Python inference engine (`src/inference_engine.py`) provides an alternative, testable rule implementation that the GUI and controller use when CLIPS is not required.
- `src/app_controller.py` wires input handling, inference, and output formatting for use by the (optional) GUI under `gui/`.
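The wiring above can be sketched as follows. The class names match the modules listed, but the method names (`collect`, `evaluate`, `format`) and the hard-coded rule are assumptions for illustration, not the repository's actual APIs.

```python
# Illustrative controller wiring; method names are assumed, not the real APIs.
class InputHandler:
    def collect(self, raw):
        # Validate/normalize raw answers into a profile dict.
        return {k: str(v).strip().lower() for k, v in raw.items()}

class InferenceEngine:
    def evaluate(self, profile):
        # Fire rules against the profile (one hard-coded rule for the sketch).
        if profile.get("vpn") == "no":
            return [{"priority": "medium", "message": "Consider using a VPN"}]
        return []

class OutputHandler:
    def format(self, recommendations):
        return [f"[{r['priority'].upper()}] {r['message']}" for r in recommendations]

class AppController:
    """Ties input handling, inference, and output formatting together."""
    def __init__(self):
        self.inputs = InputHandler()
        self.engine = InferenceEngine()
        self.output = OutputHandler()

    def run(self, raw_answers):
        profile = self.inputs.collect(raw_answers)
        return self.output.format(self.engine.evaluate(profile))

print(AppController().run({"vpn": " NO "}))  # → ['[MEDIUM] Consider using a VPN']
```

Keeping each stage behind a small interface like this is what lets the GUI swap the CLIPS runtime in or out without touching input handling or formatting.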
- `clips/` — CLIPS rule files and templates
  - `templates.clp` — CLIPS `deftemplate` definitions
  - `knowledge_base.clp` — CLIPS rules that assert `recommendation` facts
  - `sample_facts.clp` — example facts used for testing
- `src/` — Python application code
  - `inference_engine.py` — Python rule-based inference implementation (used in tests)
  - `input_handler.py` — input validation and conversion
  - `output_handler.py` — formatting and ranking of recommendations
  - `app_controller.py` — top-level controller (ties together input, inference, output)
  - `main.py` — small runner for the application (see below)
- `gui/` — optional GUI components (PyQt/Tkinter, etc.)
- `tests/` — pytest tests for the repository
- `scripts/` — helper scripts (parsing, diagnostics)
See INSTALL.md for detailed installation instructions and troubleshooting.
Quick start (Windows):
```
python -m venv .venv
.venv\Scripts\activate
python -m pip install --upgrade pip setuptools wheel
pip install -r requirements.txt --no-cache-dir
```

On macOS/Linux:

```
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip setuptools wheel
pip install -r requirements.txt --no-cache-dir
```

Verify installation:

```
python -c "import streamlit, google.generativeai; print('✓ Ready to go!')"
```

```
streamlit run app.py
```

Then select "Structured Assessment" from the sidebar. This will:
- Ask 10 targeted privacy questions
- Guide you through the assessment with navigation buttons
- Show prioritized recommendations (🔴 HIGH, 🟡 MEDIUM, 🟢 LOW)
- Display your overall risk score
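The overall risk score shown at the end of the assessment is the sum of the fired rules' `risk-score` values, bucketed into a coarse level. The thresholds below are illustrative assumptions, not the app's actual cut-offs.

```python
def overall_risk_level(recommendations, high_cutoff=50, medium_cutoff=20):
    """Map the additive risk score to a coarse level (thresholds are assumed)."""
    total = sum(r.get("risk-score", 0) for r in recommendations)
    if total >= high_cutoff:
        return total, "🔴 HIGH"
    if total >= medium_cutoff:
        return total, "🟡 MEDIUM"
    return total, "🟢 LOW"

total, level = overall_risk_level([{"risk-score": 12}, {"risk-score": 25}])
print(total, level)  # → 37 🟡 MEDIUM
```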
```
streamlit run app.py
```

Then select "AI Chatbot" from the sidebar. This will:
- Enable you to ask any privacy/security question in natural language
- Use Google's Gemini API to provide intelligent responses
- Maintain conversation context across multiple exchanges
- Cover all aspects of digital privacy and security
To use the chatbot:
- Get a free Gemini API key: https://aistudio.google.com/app/apikey
- Paste the key in the sidebar when prompted
- Start asking questions!
The original CLI chat interface (not recommended) is in `src/main.py`:

```
python -m src.main
```

This will:
- Open your default web browser to http://localhost:8501
- Display an interactive chat-style assessment with 10 questions
- Allow you to navigate forward/backward through questions
- Show a progress bar and results with color-coded priority levels
- Provide detailed recommendations sorted by priority (🔴 HIGH, 🟡 MEDIUM, 🟢 LOW)
- Let you start over and retake the assessment anytime
The GUI runs on pure Python and does not require CLIPS. To enable CLIPS file parsing for advanced testing, `clipspy` is already included in `requirements.txt`.
Run the test suite with pytest from the repository root:
```
python -m pytest -q
```

Notes about tests:

- `tests/test_clips.py` checks that the expected CLIPS files exist and, if `clipspy` is installed, attempts to load and run them. If `clipspy` is not installed, the runtime portion of the test is skipped.
- The Python `InferenceEngine` has its own behavior tests (you can add more unit tests under `tests/` for edge cases).
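A behavior test for the engine might look like the sketch below. Because this example must stand alone, it tests a minimal stand-in `evaluate` function; in the repository you would instead import the real `InferenceEngine` from `src/inference_engine.py`.

```python
# Self-contained pytest-style test; `evaluate` is a stand-in for the real engine.
def evaluate(profile):
    recs = []
    if profile.get("vpn") == "no":
        recs.append({"priority": "medium", "risk-score": 12})
    return recs

def test_no_vpn_fires():
    recs = evaluate({"vpn": "no"})
    assert len(recs) == 1
    assert recs[0]["priority"] == "medium"

def test_empty_profile_yields_no_recommendations():
    # Edge case: empty input should not raise and should produce nothing.
    assert evaluate({}) == []

if __name__ == "__main__":
    test_no_vpn_fires()
    test_empty_profile_yields_no_recommendations()
    print("ok")
```

Dropped into `tests/`, functions named `test_*` like these are discovered and run automatically by `python -m pytest -q`.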
- Templates first: `clips/templates.clp` defines `deftemplate user-profile` and `deftemplate recommendation`. Templates must be loaded before any rules that use them.
- Rules: `clips/knowledge_base.clp` contains `defrule` forms that assert `recommendation` facts. Keep rules small and focused.
- Facts: `clips/sample_facts.clp` demonstrates asserting a `user-profile` fact and running the engine.
Common CLIPS pitfalls and tips:

- Watch multislot binding syntax. Use `$?name` in the pattern to bind a multislot and `?name` inside function calls such as `(length$ ?name)` in constraints.
- Ensure parentheses are balanced. Small mismatches lead to parser errors that include a file and line number when using `clipspy`.
- If you see parser errors in tests, run the diagnostics script to get a contextual snippet:

  ```
  python scripts\parse_clips.py clips\knowledge_base.clp
  ```

- Add tests for any changes to `src/inference_engine.py` and for additional CLIPS files you add.
- Follow typical Python project conventions: run tests locally, keep commits small, and open PRs for review.
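The kind of balance check a diagnostics script performs can be sketched in a few lines. This is not the actual `scripts/parse_clips.py`; it is a simplified version that tracks parentheses outside string literals and reports the offending line.

```python
def first_unbalanced_line(clips_source):
    """Return the 1-based line of the first unmatched ')' (or of the unclosed
    top-level '(' at EOF); return None if parentheses balance.
    Simplified sketch: skips string literals, but ignores escaped quotes
    and CLIPS ';' comments."""
    depth, open_line, in_string = 0, None, False
    for lineno, line in enumerate(clips_source.splitlines(), start=1):
        for ch in line:
            if ch == '"':
                in_string = not in_string
            elif not in_string:
                if ch == "(":
                    depth += 1
                    if depth == 1:
                        open_line = lineno  # remember where the top-level form opened
                elif ch == ")":
                    depth -= 1
                    if depth < 0:
                        return lineno  # stray closing paren
    return open_line if depth > 0 else None

print(first_unbalanced_line("(defrule r\n  (fact)\n"))  # → 1 (unclosed form)
```

Pointing at the opening line of the unclosed form is usually more helpful than reporting end-of-file, since that is where the missing `)` belongs.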
This project is licensed under the MIT License. See the LICENSE file for details.
You are free to:
- ✅ Use this project for any purpose (commercial or personal)
- ✅ Modify and distribute the code
- ✅ Include it in other projects (open source or proprietary)
As long as you:
- 📋 Include a copy of the license and copyright notice
- ⚠️ Don't hold the authors liable for any issues
- Add `requirements.txt` and `dev-requirements.txt` to pin dependencies.
Demo Video - https://youtu.be/PPeEg0SyTJE?si=LMiBMcMNisgW57xp
- Add a GitHub Actions workflow to run pytest and optionally install `clipspy` so CLIPS parser errors are caught on CI.
- Add more unit tests for `InferenceEngine` covering edge cases (empty inputs, unexpected types).
- Improve the GUI to allow editing facts and re-running the CLIPS engine interactively.