# Claude Code PowerShell Python App

A sophisticated PowerShell wrapper application that provides coding assistance in the style of Claude Code, using Qwen3-Coder models, with support for both the LM Studio server and direct model loading.

## Files Created

- `lm_studio_client.py` - Enhanced Python client script with Qwen3-Coder features
- `lm_studio.ps1` - PowerShell wrapper script
- `README.md` - This documentation

## Prerequisites

1. Python 3.7+ installed and on `PATH`
2. For LM Studio: LM Studio running with its local server enabled on `http://127.0.0.1:1234`
3. For Qwen direct: the `transformers` and `torch` libraries
4. The Python `requests` library (auto-installed by the script)
5. Optional: Flask for the web interface

## Usage

### PowerShell Commands:

**List available models (LM Studio only):**

```powershell
.\lm_studio.ps1 -ListModels
```

**Single prompt:**

```powershell
.\lm_studio.ps1 -Prompt "Write a Python function to sort a list"
```

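Under the hood, a single prompt is one round trip to LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint. A minimal sketch of the request body the client presumably sends (the model name and system prompt here are illustrative placeholders, not values taken from `lm_studio_client.py`):

```python
def build_chat_request(prompt, model="qwen3-coder",
                       system="You are a helpful coding assistant."):
    """Build an OpenAI-style chat-completions request body.

    The model name and system prompt are placeholders; the actual
    client may use different defaults.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.7,
        "stream": False,
    }

# The client would POST this as JSON to
# http://127.0.0.1:1234/v1/chat/completions and read
# response["choices"][0]["message"]["content"].
body = build_chat_request("Write a Python function to sort a list")
```
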
**Interactive chat mode:**

```powershell
.\lm_studio.ps1 -Interactive
```

**With specific language focus:**

```powershell
.\lm_studio.ps1 -Interactive -Language python
```

**Using the Qwen direct model:**

```powershell
.\lm_studio.ps1 -Client qwen -Prompt "Hello"
```

**Fill-in-the-middle code completion:**

```powershell
.\lm_studio.ps1 -Client qwen -FimPrefix "def sort_list(arr):" -FimSuffix "return sorted_arr"
```

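Fill-in-the-middle works by wrapping the prefix and suffix in the model's special FIM tokens and letting the model generate the code in between. A sketch of the prompt format, assuming the `<|fim_prefix|>` / `<|fim_suffix|>` / `<|fim_middle|>` tokens used by the Qwen-Coder family (verify against the tokenizer of the exact model you load):

```python
def build_fim_prompt(prefix, suffix):
    """Assemble a fill-in-the-middle prompt.

    Assumes Qwen-Coder-style FIM tokens; the model generates the
    missing middle after the <|fim_middle|> marker.
    """
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"

prompt = build_fim_prompt("def sort_list(arr):\n    ", "\n    return sorted_arr")
```
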
**Start the web interface:**

```powershell
.\lm_studio.ps1 -Web -Port 8080
```

**Start the terminal user interface:**

```powershell
.\lm_studio.ps1 -Tui
```

### Direct Python Usage:

```bash
python lm_studio_client.py --help
python lm_studio_client.py --client qwen --prompt "Create a REST API"
python lm_studio_client.py --interactive
python lm_studio_client.py --client qwen --fim-prefix "def hello():" --fim-suffix "print('world')"
python lm_studio_client.py --web --port 5000
python lm_studio_client.py --tui
```

## Features

- **Dual Client Support**: LM Studio server or direct Qwen3-Coder model loading
- **Interactive Chat**: Real-time conversation with the AI coding assistant
- **Terminal User Interface**: Curses-based TUI for interactive chat
- **Fill-in-the-Middle**: Advanced code completion for partial code snippets
- **Language-Specific Assistance**: Focus on specific programming languages
- **Web Interface**: Modern web UI with tabs for different features
- **Model Selection**: Choose from available models
- **Auto-Dependency Installation**: Automatically installs required Python packages
- **Error Handling**: Robust error handling and validation

## Qwen3-Coder Features

- **Agentic Coding**: Advanced coding capabilities with tool use
- **Long Context**: Support for up to 256K tokens
- **358 Programming Languages**: Comprehensive language support
- **Fill-in-the-Middle**: Specialized code completion
- **Function Calling**: Tool integration capabilities

## Setup

1. For LM Studio: ensure LM Studio is running with its server enabled
2. For Qwen direct: install the libraries: `pip install transformers torch`
3. For the web interface: install Flask: `pip install flask`
4. Run the PowerShell script from the same directory as the Python client
5. The script will auto-install required Python dependencies

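The auto-install in step 5 can be done with a try-import-then-pip pattern; a minimal sketch (the helper name is illustrative, not necessarily what `lm_studio_client.py` uses):

```python
import importlib
import subprocess
import sys

def ensure_package(module_name, pip_name=None):
    """Import module_name; if it is missing, install it with pip
    (using the current interpreter) and retry the import."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", pip_name or module_name]
        )
        return importlib.import_module(module_name)

# e.g. requests = ensure_package("requests")
```

Using `sys.executable -m pip` rather than a bare `pip` command ensures the package lands in the same Python environment the script is running under.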
## Web Interface

The web interface provides three main tabs:

- **Chat**: Interactive conversation with the AI
- **Fill-in-the-Middle**: Code completion for partial snippets
- **Code Assistant**: Generate code from descriptions

Access at `http://localhost:5000` (or a custom port).