
# DevLama (formerly JSLama)

AI-powered development assistant that leverages Ollama's language models for code generation and assistance.

## 🚧 Current Status: Under Development

This project is currently under active development. Some features might not work as expected. Please report any issues you encounter.

## 🚀 Quick Start

1. Install the required dependencies:

   ```bash
   # Install Node.js (v14 or newer; the v18 setup script is shown here)
   curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
   sudo apt-get install -y nodejs

   # Install Ollama
   curl -fsSL https://ollama.com/install.sh | sh

   # Install DevLama globally
   npm install -g devlama
   ```

2. Start the required services:

   ```bash
   # Start the Ollama service
   ollama serve &

   # Start the getllm API (if using local models)
   # Make sure getllm is installed first: pip install getllm
   getllm serve
   ```

3. Test the installation:

   ```bash
   devlama --version
   ```

## 🛠️ TODO List

### High Priority

- Fix CLI command execution
- Implement proper error handling for Ollama API calls
- Add a connection test for the getllm API
- Create a proper configuration system
- Add proper logging

### Medium Priority

- Add support for different programming languages
- Implement context-aware code generation
- Add tests for all major components
- Create a documentation website
- Add a CI/CD pipeline

### Low Priority

- Add a plugin system
- Implement code review functionality
- Add support for custom templates
- Create a VS Code extension

## 🗺️ Roadmap

### v0.2.0 - Core Functionality (Current)

- Basic CLI interface
- Integration with Ollama
- Simple code generation
- Connection to the getllm API
- Basic error handling

### v0.3.0 - Enhanced Features

- Configuration system
- Improved error messages
- Better documentation
- Basic testing

### v0.4.0 - Developer Experience

- VS Code extension
- Plugin system
- Template support
- Improved logging

### v1.0.0 - Stable Release

- Full test coverage
- Comprehensive documentation
- Performance optimizations
- Community guidelines

## 🔌 Integration with getllm

To use DevLama with getllm, make sure the getllm API is running:

```bash
# Install getllm if not already installed
pip install getllm

# Start the getllm API server
getllm serve

# In another terminal, test the connection
curl http://localhost:8000/health
```

πŸ› Known Issues

  • Connection to getllm API might fail if the service is not running
  • Some commands might not work as expected in the current version
  • Limited error handling in the current implementation

## 🤝 Contributing

Contributions are welcome! Please read our Contributing Guidelines for details on how to get started.

## 📄 License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.


## PyLama Ecosystem Navigation

| Project | Description | Links |
|---------|-------------|-------|
| DevLama | AI-powered development assistant | GitHub · NPM · Docs |
| GetLLM | LLM model management and code generation | GitHub · PyPI · Docs |
| LogLama | Centralized logging and environment management | GitHub · PyPI · Docs |
| APILama | API service for code generation | GitHub · Docs |
| BEXY | Sandbox for executing generated code | GitHub · NPM · Docs |
| JSLama | JavaScript code generation | GitHub · NPM · Docs |
| SheLLama | Shell command generation | GitHub · PyPI · Docs |
| WebLama | Web application generation | GitHub · Docs |

## Author

Tom Sapletta – DevOps Engineer & Systems Architect

- 💻 15+ years in DevOps, Software Development, and Systems Architecture
- 🏢 Founder & CEO at Telemonit (Portigen - edge computing power solutions)
- 🌍 Based in Germany | Open to remote collaboration
- 📚 Passionate about edge computing, hypermodularization, and automated SDLC

GitHub LinkedIn ORCID Portfolio

## Support This Project

If you find this project useful, please consider supporting it.

## Installation

```bash
npm install -g devlama  # for global CLI usage
# or
yarn global add devlama
```

## Quick Start

### Command Line Usage

```bash
# Initialize a new project
devlama init my-project

# Generate code from a prompt
devlama generate "Create a React component that displays a counter"

# Start interactive mode
devlama

# Show the version
devlama --version
```

### Programmatic Usage

```javascript
const { DevLama } = require('devlama');

const devlama = new DevLama({
  model: 'codellama',  // default model
  temperature: 0.7,
});

async function main() {
  // Generate code from a prompt (top-level await is not available in
  // CommonJS modules, so the call is wrapped in an async function)
  const code = await devlama.generateCode(
    'Create a function that sorts an array of objects by a property'
  );
  console.log(code);
}

main().catch(console.error);
```
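
Since calls to the Ollama backend can fail transiently (see Known Issues), a small retry wrapper around `generateCode` can help. This is a generic sketch, not part of the DevLama API; the `withRetry` helper and its defaults are assumptions.

```javascript
// Generic retry helper: re-runs an async function with linear backoff.
// Not part of the DevLama API; names and defaults here are illustrative.
async function withRetry(fn, attempts = 3, delayMs = 500) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Back off a little longer after each failed attempt.
      await new Promise(resolve => setTimeout(resolve, delayMs * (i + 1)));
    }
  }
  throw lastErr; // all attempts failed
}

// Demo with a stand-in flaky function (no DevLama install required):
let tries = 0;
withRetry(async () => {
  tries += 1;
  if (tries < 3) throw new Error('transient failure');
  return 'ok';
}, 3, 10).then(() => console.log(`succeeded after ${tries} tries`));

// With DevLama it would look like (assuming `devlama` from the example above):
// const code = await withRetry(() => devlama.generateCode('...prompt...'));
```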

## Features

- AI-powered code generation and assistance
- Support for multiple programming languages
- Integration with Ollama's language models
- Interactive REPL for development
- Configurable model parameters
- Project scaffolding and management

## Testing

To run tests for JSLama using the PyLama ecosystem:

```bash
cd ../../tests
./run_all_tests.sh
# or for a tolerant run
./run_all_tests_tolerant.sh
```

Or, from the jslama directory:

```bash
make test
```

## Project Management

Common Makefile commands:

- `make install` – Install dependencies
- `make lint` – Lint code
- `make test` – Run tests
- `make build` – Build the project
- `make clean` – Clean build artifacts and dependencies
- `make format` – Format code
- `make start` – Start the project (if supported)
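
For reference, here is a minimal sketch of what such a Makefile might look like, assuming each target simply delegates to a standard npm script; the project's actual Makefile may differ.

```makefile
# Hypothetical sketch only: each target delegates to an npm script.
install:
	npm install

lint:
	npm run lint

test:
	npm test

build:
	npm run build

clean:
	rm -rf node_modules dist

format:
	npm run format

start:
	npm start
```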

## Example: Code Generation with JSLama

```javascript
const JSLama = require('jslama');

JSLama.generate('Write a function to reverse a string.').then(code => {
  console.log(code);
  // Example output (model responses vary):
  // function reverseString(str) { return str.split('').reverse().join(''); }
});
```

JSLama is a JavaScript code generation tool that leverages Ollama's language models. It is part of the PyLama ecosystem and integrates with LogLama as the primary service for centralized logging and environment management.
