╔══════════════════════════════════════════════════════════════════════════════╗
║ INTEGRATED COGNITIVE INFERENCE SYSTEM - DOCUMENTATION ║
╚══════════════════════════════════════════════════════════════════════════════╝
This is a complete Cognitive AI Pipeline that transforms natural language text into structured knowledge, processes it through logical inference, and enriches it with emotional analysis.
```
Natural Language Text
        ↓
[STAGE 1: NLP Extraction]
        ↓
Structured Facts (.inf format)
        ↓
[STAGE 2: Inference Engine]
        ↓
Natural Language Inferences
        ↓
[STAGE 3: Emotional Analysis]
        ↓
Emotionally Enriched Knowledge
```
## Installation

1. Install Ollama

   ```bash
   # Visit https://ollama.ai/ and install Ollama
   # Then download the Gemma model:
   ollama pull gemma:2b
   ```

2. Install Python dependencies

   ```bash
   pip install requests
   ```

3. Have the inference engine ready

   - Make sure `engine.py` is in the same directory
   - Make sure Ollama is running (`ollama serve`)
## Quick Start

```bash
python integrated_system.py
```

## Project Structure

```
Inference/
├── engine.py                    # Core inference engine
├── nlp_to_inference.py          # NLP → structured-facts converter
├── emotional_analyzer.py        # Emotional analysis module
├── integrated_system.py         # Main integrated pipeline
├── test.inf                     # Sample inference file
├── README.md                    # Engine documentation
└── COGNITIVE_SYSTEM_README.md   # This file
```
## Stage 1: NLP Extraction (`nlp_to_inference.py`)

Purpose: Converts natural language sentences into structured inference format.
Input:

```
Pedro is an excellent student who lives in Madrid.
Bob is the father of Pedro and works at Microsoft.
```

Output (.inf format):

```
(Pedro)IsA(student)
(Pedro)LivesIn(Madrid)
(Bob)FatherOf(Pedro)
(Bob)WorksAt(Microsoft)
```
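The `(Subject)Relation(Object)` lines above follow a regular pattern, so they can be parsed with a small regex. This helper is an illustrative sketch, not part of the shipped modules:

```python
import re

# One fact per line: (Subject)Relation(Object)
FACT_RE = re.compile(r"^\((?P<subject>[^)]+)\)(?P<relation>\w+)\((?P<object>[^)]+)\)$")

def parse_fact(line: str):
    """Parse one '(Subject)Relation(Object)' line into a triple, or None."""
    m = FACT_RE.match(line.strip())
    if m:
        return (m["subject"], m["relation"], m["object"])
    return None

print(parse_fact("(Pedro)LivesIn(Madrid)"))  # → ('Pedro', 'LivesIn', 'Madrid')
```

Lines that do not match the pattern (comments, blanks) yield `None` and can be skipped.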
Usage:

```python
from nlp_to_inference import NLPToInferenceConverter

converter = NLPToInferenceConverter()
converter.convert_and_save("Your text here...", "output.inf")
```

## Stage 2: Inference Engine (`engine.py`)

Purpose: Processes structured facts and generates natural language inferences with automatic concatenation.
Input (.inf format):

```
(Pedro)IsA(student)
(Pedro)LivesIn(Madrid)
(Bob)FatherOf(Pedro)
```

Output:

```
Pedro is a student and lives in Madrid.
Bob father of Pedro.
```
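The "automatic concatenation" shown above groups facts that share a subject into one sentence. A minimal sketch of the idea, assuming parsed `(subject, relation, object)` triples and a hand-written phrase table (the real `engine.py` may phrase relations differently):

```python
# Hypothetical relation → phrase table; the shipped engine may use another.
PHRASES = {"IsA": "is a", "LivesIn": "lives in",
           "FatherOf": "father of", "WorksAt": "works at"}

def concatenate(facts):
    """Group (subject, relation, object) triples by subject, one sentence each."""
    by_subject = {}
    for subj, rel, obj in facts:
        by_subject.setdefault(subj, []).append(f"{PHRASES[rel]} {obj}")
    return [f"{subj} {' and '.join(parts)}." for subj, parts in by_subject.items()]

print(concatenate([("Pedro", "IsA", "student"), ("Pedro", "LivesIn", "Madrid")]))
# → ['Pedro is a student and lives in Madrid.']
```

Note that a bare phrase table also reproduces the sample's awkward "Bob father of Pedro." output, since `FatherOf` maps to a noun phrase rather than a verb phrase.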
Usage:

```python
from engine import InferenceEngine

engine = InferenceEngine()
inferences = engine.load_file("input.inf")
for inference in inferences:
    print(inference)
```

## Stage 3: Emotional Analysis (`emotional_analyzer.py`)

Purpose: Analyzes inferences for emotional content and sentiment.
Input:

```
Pedro is a student and lives in Madrid.
The impact was terrible and broke the wall.
Pedro felt very happy about his success.
```

Output:

```
Pedro is a student and lives in Madrid.
→ Emotion: Neutral, Sentiment: Neutral
The impact was terrible and broke the wall.
→ Emotion: Sadness, Sentiment: Negative
Pedro felt very happy about his success.
→ Emotion: Joy, Sentiment: Positive
```
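A report in the format above can be read back into structured records. The sketch below pairs each sentence with the arrow line that follows it, assuming exactly one `→ Emotion: …, Sentiment: …` line per sentence (this parser is illustrative, not part of the shipped modules):

```python
import re

ANALYSIS_RE = re.compile(r"→ Emotion: (?P<emotion>\w+), Sentiment: (?P<sentiment>\w+)")

def parse_analysis(text):
    """Pair each sentence with the '→ Emotion, Sentiment' line that follows it."""
    records, sentence = [], None
    for line in text.splitlines():
        m = ANALYSIS_RE.search(line)
        if m:
            records.append({"sentence": sentence,
                            "emotion": m["emotion"],
                            "sentiment": m["sentiment"]})
        elif line.strip():
            sentence = line.strip()
    return records
```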
Usage:

```python
from emotional_analyzer import EmotionalAnalyzer

analyzer = EmotionalAnalyzer()
analyzer.analyze_and_save("inferences.txt", "emotional_analysis.txt")
```

## The Integrated Pipeline (`integrated_system.py`)

Purpose: Runs the complete pipeline automatically.
Example:

```python
from integrated_system import IntegratedCognitiveSystem

system = IntegratedCognitiveSystem()
text = """
Pedro is an excellent student who lives in Madrid.
Bob is his father and works at Microsoft.
"""
system.process_and_save(text, "results.txt")
```

## Usage Examples

### Example 1: Accessing Individual Results

```python
from integrated_system import IntegratedCognitiveSystem

system = IntegratedCognitiveSystem()
text = """
Marco is a talented teacher who loves helping students.
He teaches Python to Pedro with great enthusiasm.
Pedro feels very happy about learning programming.
"""
results = system.process_text(text, verbose=True)

# Access different parts of the analysis
print("Structured Facts:", results['structured_facts'])
print("Natural Inferences:", results['natural_inferences'])
print("Emotional Analysis:", results['emotional_analysis'])
print("Summary:", results['summary'])
```

### Example 2: Complete Story Analysis

```python
from integrated_system import IntegratedCognitiveSystem

system = IntegratedCognitiveSystem()
story = """
Once upon a time, there was a young boy named Pedro. Pedro was an
excellent student who lived in the beautiful city of Madrid. He studied
Computer Science and was very passionate about programming. His father,
Bob, was an experienced programmer who worked at Microsoft. Bob was very
proud of his son. One day, Marco, Pedro's teacher, taught him Python
programming. Pedro was thrilled and excited to learn. However, later that
day, while playing football, Pedro kicked the ball too hard. The ball
impacted the wall violently and broke it. Pedro felt terrible about the
incident and became very sad.
"""

system.process_and_save(story, "story_analysis.txt")
```
### Example 3: Batch Processing

```python
from integrated_system import IntegratedCognitiveSystem

system = IntegratedCognitiveSystem()
texts = [
    "Bob is a programmer who lives in Seattle.",
    "Marco teaches Python to students.",
    "Pedro feels happy about his achievements."
]
for i, text in enumerate(texts, 1):
    results = system.process_text(text, verbose=False)
    print(f"\n--- Text {i} ---")
    # Pair each inference with its own analysis entry
    # (indexing [0] would reuse the first entry for every inference)
    for inf, analysis in zip(results['natural_inferences'],
                             results['emotional_analysis']):
        print(f"{inf} [{analysis['emotion']}, {analysis['sentiment']}]")
```
## Output Report

The integrated system generates a comprehensive report with three stages:

Stage 1, structured facts:

```
(Pedro)IsA(student)
(Pedro)LivesIn(Madrid)
(Bob)FatherOf(Pedro)
```

Stage 2, natural language inferences:

```
Pedro is a student and lives in Madrid.
Bob father of Pedro.
```

Stage 3, emotional analysis:

```
Pedro is a student and lives in Madrid.
→ Emotion: Neutral, Sentiment: Neutral
Bob father of Pedro.
→ Emotion: Neutral, Sentiment: Neutral
```

Summary:

```
Total Sentences: 2
Emotions:
• Neutral: 2 (100.0%)
Sentiments:
• Neutral: 2 (100.0%)
```
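The percentages in the summary follow directly from counting. A sketch assuming records shaped like the analyzer's output (the `"emotion"` field name is an assumption):

```python
from collections import Counter

def summarize(records):
    """Count emotions and report each as 'count (percent)'."""
    if not records:
        return {}
    total = len(records)
    emotions = Counter(r["emotion"] for r in records)
    return {emo: f"{n} ({100.0 * n / total:.1f}%)" for emo, n in emotions.items()}

print(summarize([{"emotion": "Neutral"}, {"emotion": "Neutral"}]))
# → {'Neutral': '2 (100.0%)'}
```

The same counter applied to the `"sentiment"` field yields the sentiment breakdown.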
### Emotion Categories

- Joy: Happiness, excitement, pleasure
- Sadness: Sorrow, disappointment, grief
- Anger: Frustration, irritation, rage
- Fear: Anxiety, worry, terror
- Surprise: Astonishment, amazement
- Disgust: Revulsion, distaste
- Neutral: No strong emotion

### Sentiment Categories

- Positive: Favorable, good, pleasant
- Negative: Unfavorable, bad, unpleasant
- Neutral: Neither positive nor negative
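The two category sets are related: most emotions imply a default sentiment. A hedged sketch of one plausible mapping (the LLM analyzer assigns sentiment itself, so this table is only illustrative):

```python
# Illustrative defaults only; the analyzer's LLM may decide differently
# (e.g. Surprise can be positive or negative depending on context).
EMOTION_TO_SENTIMENT = {
    "Joy": "Positive",
    "Sadness": "Negative",
    "Anger": "Negative",
    "Fear": "Negative",
    "Disgust": "Negative",
    "Surprise": "Neutral",
    "Neutral": "Neutral",
}

def default_sentiment(emotion: str) -> str:
    """Fall back to Neutral for unknown emotion labels."""
    return EMOTION_TO_SENTIMENT.get(emotion, "Neutral")
```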
## Configuration

Use a different model:

```python
system = IntegratedCognitiveSystem(model_name="gemma:7b")  # Use larger model
```

To tune determinism, edit the temperature parameter in nlp_to_inference.py or emotional_analyzer.py:

```python
"temperature": 0.3  # Higher = more creative, Lower = more deterministic
```

## Troubleshooting

Ollama is not responding. Solution: make sure Ollama is running:

```bash
ollama serve
```

The model is missing. Solution: download the model:

```bash
ollama pull gemma:2b
```

Extraction quality is poor. Solution: try a larger model:

```bash
ollama pull gemma:7b
# Then use: IntegratedCognitiveSystem(model_name="gemma:7b")
```

## Future Enhancements

Potential improvements:
- Query System: Ask questions about the knowledge base
- Relationship Graph: Visualize connections between entities
- Temporal Reasoning: Track changes over time
- Contradiction Detection: Identify conflicting facts
- Multi-language Support: Process text in Spanish, French, etc.
- Confidence Scores: Assign certainty levels to inferences
- Interactive Mode: Real-time processing with user feedback
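As a taste of the first item, a knowledge base of parsed fact triples can already be queried by pattern matching, with `None` acting as a wildcard. This is a hypothetical sketch, not implemented in this repository:

```python
def query(facts, subject=None, relation=None, obj=None):
    """Return (subject, relation, object) triples matching the given fields;
    a field left as None matches anything."""
    return [f for f in facts
            if (subject is None or f[0] == subject)
            and (relation is None or f[1] == relation)
            and (obj is None or f[2] == obj)]

facts = [("Pedro", "LivesIn", "Madrid"), ("Bob", "WorksAt", "Microsoft")]
print(query(facts, relation="LivesIn"))  # → [('Pedro', 'LivesIn', 'Madrid')]
```

Answering natural-language questions would then reduce to translating the question into such a pattern, e.g. "Where does Pedro live?" becomes `query(facts, subject="Pedro", relation="LivesIn")`.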
## License

Open source, for educational and research use.

## Author

Marco

Integrated Cognitive System, October 2025
╔══════════════════════════════════════════════════════════════════════════════╗
║ Thank you for using the Integrated Cognitive System! ║
║ Knowledge • Inference • Emotion ║
╚══════════════════════════════════════════════════════════════════════════════╝