Troubleshooting Guide
This guide covers common issues you might encounter when using DSPy.rb and their solutions.
Language Model Configuration
Error: NoMethodError: undefined method 'model' for nil
Problem: This error occurs when a DSPy module doesn’t have a language model configured.
# Note: `module` is a reserved keyword in Ruby and cannot be a variable name
predictor = DSPy::Predict.new(MySignature)
predictor.forward(input: "test")
# => NoMethodError: undefined method 'model' for nil
Solution: Configure a language model either globally or at the module level.
# Option 1: Global configuration
DSPy.configure do |config|
  config.lm = DSPy::LM.new("openai/gpt-4", api_key: ENV["OPENAI_API_KEY"])
end

# Option 2: Module-level configuration
predictor = DSPy::Predict.new(MySignature)
predictor.configure do |config|
  config.lm = DSPy::LM.new("anthropic/claude-3", api_key: ENV["ANTHROPIC_API_KEY"])
end
Error: DSPy::ConfigurationError
Problem: No language model is configured. Starting with version 0.9.0, DSPy raises a DSPy::ConfigurationError whose message explains the fix:
DSPy::ConfigurationError: No language model configured for MyModule module.
To fix this, configure a language model either globally:
  DSPy.configure do |config|
    config.lm = DSPy::LM.new("openai/gpt-4", api_key: ENV["OPENAI_API_KEY"])
  end
Or on the module instance:
  module_instance.configure do |config|
    config.lm = DSPy::LM.new("anthropic/claude-3", api_key: ENV["ANTHROPIC_API_KEY"])
  end
Solution: Follow the instructions in the error message to configure an LM.
Gem Conflicts
Warning: ruby-openai gem detected
Problem: DSPy uses the official OpenAI SDK, which conflicts with the community ruby-openai gem.
WARNING: ruby-openai gem detected. This may cause conflicts with DSPy's OpenAI integration.
DSPy uses the official 'openai' gem. The community 'ruby-openai' gem uses the same
OpenAI namespace and will cause conflicts.
Solution: Remove ruby-openai from your Gemfile and use the official gem:
# Gemfile
# Remove this line:
# gem 'ruby-openai'
# DSPy already includes the official gem internally
gem 'dspy'
If you need both gems for different parts of your application, consider isolating them in separate processes or using bundler groups to load them conditionally.
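If you do take the bundler-groups route, one hypothetical Gemfile layout looks like this (the group name is illustrative; the key idea is `require: false` plus loading the legacy group in only one process):

```ruby
# Gemfile — illustrative sketch, not a prescribed layout.
# Both gems define the OpenAI constant, so never load them in the
# same process; `require: false` defers loading until you ask for it.
gem "dspy"

group :legacy_openai do
  gem "ruby-openai", require: false
end
```

A process that needs the legacy client can load it explicitly with `Bundler.require(:legacy_openai)`, while processes using DSPy never touch that group.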
Namespace Conflicts
Problem: Both gems use the OpenAI namespace, causing method conflicts and unexpected behavior.
Solution:
- Use only the official openai gem that DSPy depends on
- If migration is needed, update your code to use the official SDK's API:
# ruby-openai (old)
client = OpenAI::Client.new(access_token: "key")
response = client.chat(parameters: { model: "gpt-4", messages: [...] })
# official openai SDK (new)
client = OpenAI::Client.new(api_key: "key")
response = client.chat.completions.create(model: "gpt-4", messages: [...])
API Key Issues
Error: DSPy::LM::MissingAPIKeyError
Problem: API key is not provided for the language model.
Solution: Set the API key via environment variable or parameter:
# Via environment variable
export OPENAI_API_KEY=your-key-here
export ANTHROPIC_API_KEY=your-key-here
export GEMINI_API_KEY=your-key-here
# Via parameter
lm = DSPy::LM.new("openai/gpt-4", api_key: "your-key-here")
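To catch a missing key at boot rather than on the first API call, you can fail fast with plain Ruby before configuring DSPy. A small sketch — `fetch_api_key` is a hypothetical helper, not a DSPy API:

```ruby
# Hypothetical helper: raise at startup if a required key is unset or blank.
def fetch_api_key(name)
  key = ENV.fetch(name) # ENV.fetch raises KeyError when the variable is unset
  raise ArgumentError, "#{name} is set but empty" if key.strip.empty?
  key
end

# Usage (commented out so the sketch stays self-contained):
# lm = DSPy::LM.new("openai/gpt-4", api_key: fetch_api_key("OPENAI_API_KEY"))
```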
JSON Parsing Issues
Error: JSON parsing failures
Problem: LLM returns invalid JSON that can’t be parsed.
Solution: DSPy.rb uses robust extraction strategies that try multiple patterns (code blocks, raw JSON, nested objects). For best reliability, use providers with native structured outputs:
DSPy.configure do |config|
  # OpenAI with native structured outputs (recommended)
  config.lm = DSPy::LM.new(
    "openai/gpt-4o-mini",
    api_key: ENV["OPENAI_API_KEY"],
    structured_outputs: true
  )

  # Gemini with native structured outputs (recommended)
  # config.lm = DSPy::LM.new(
  #   "gemini/gemini-2.5-flash",
  #   api_key: ENV["GEMINI_API_KEY"],
  #   structured_outputs: true
  # )

  # Anthropic with tool-based extraction (default, recommended)
  # config.lm = DSPy::LM.new(
  #   "anthropic/claude-sonnet-4-5-20250929",
  #   api_key: ENV["ANTHROPIC_API_KEY"],
  #   structured_outputs: true # Default
  # )

  # Anthropic with enhanced prompting (alternative)
  # config.lm = DSPy::LM.new(
  #   "anthropic/claude-sonnet-4-5-20250929",
  #   api_key: ENV["ANTHROPIC_API_KEY"],
  #   structured_outputs: false
  # )
end
Provider Support:
- OpenAI/Gemini/Ollama: use structured_outputs: true for native JSON mode
- Anthropic: structured_outputs: true (default) uses tool-based extraction (most reliable); structured_outputs: false uses enhanced prompting extraction
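The multi-pattern extraction idea can be illustrated with a simplified sketch in plain Ruby. This is not DSPy.rb's actual implementation — the library's strategies are more thorough — it only shows the fallback concept:

```ruby
require "json"

# Simplified illustration of multi-pattern JSON extraction:
# try a fenced code block first, then fall back to the first raw object.
def extract_json(text)
  if (m = text.match(/`{3}(?:json)?\s*(\{.*?\})\s*`{3}/m))
    JSON.parse(m[1]) # JSON inside a ```json ... ``` block
  elsif (m = text.match(/\{.*\}/m))
    JSON.parse(m[0]) # first raw JSON object anywhere in the text
  end
rescue JSON::ParserError
  nil # neither pattern yielded valid JSON
end
```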
Memory Issues
Error: Memory storage full
Problem: In-memory storage reaches capacity limits.
Solution: Use the MemoryManager to handle memory with automatic compaction:
# Basic memory manager with automatic compaction
manager = DSPy::Memory::MemoryManager.new
# Store a memory (compaction runs automatically when needed)
manager.store_memory(
  "User prefers dark mode",
  user_id: "user_123",
  tags: ["preference", "ui"]
)
# Manually trigger compaction if needed
manager.compact_if_needed!(user_id: "user_123")
# Force compaction (useful for cleanup)
manager.force_compact!(user_id: "user_123")
# Clear memories for a user
manager.clear_memories(user_id: "user_123")
Performance Issues
Slow LLM responses
Problem: API calls taking too long.
Solution:
- Use smaller models for development
- Enable caching for repeated calls
- Use async processing for batch operations
# Use a faster model for development
DSPy.configure do |config|
  config.lm = DSPy::LM.new("openai/gpt-3.5-turbo", api_key: ENV["OPENAI_API_KEY"]) if Rails.env.development?
end
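Caching repeated calls can be as simple as memoizing responses keyed by a digest of the prompt. A hypothetical wrapper — not a DSPy API; it only assumes the wrapped object responds to `generate`:

```ruby
require "digest"

# Hypothetical caching wrapper: identical prompts hit the network once.
class CachedLM
  def initialize(lm)
    @lm = lm
    @cache = {}
  end

  def generate(prompt)
    key = Digest::SHA256.hexdigest(prompt)
    return @cache[key] if @cache.key?(key) # cache hit: skip the API call
    @cache[key] = @lm.generate(prompt)
  end
end
```

Using `key?` rather than `||=` avoids re-calling the API when a cached response is legitimately nil.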
Testing Issues
VCR cassette errors
Problem: Tests fail due to outdated VCR cassettes.
Solution: Re-record cassettes when API changes:
# Delete specific cassette
rm spec/fixtures/vcr_cassettes/my_test.yml
# Re-run test to record new cassette
bundle exec rspec spec/my_test_spec.rb
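When re-recording, it also helps to keep secrets out of the new cassettes. A typical VCR setup using standard VCR options (adjust the path to your project):

```ruby
# spec/spec_helper.rb — typical VCR configuration
require "vcr"

VCR.configure do |c|
  c.cassette_library_dir = "spec/fixtures/vcr_cassettes"
  c.hook_into :webmock
  # Replace real keys with placeholders in recorded cassettes
  c.filter_sensitive_data("<OPENAI_API_KEY>") { ENV["OPENAI_API_KEY"] }
  c.filter_sensitive_data("<ANTHROPIC_API_KEY>") { ENV["ANTHROPIC_API_KEY"] }
end
```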
Common Debugging Tips
- Enable debug logging:
In development, DSPy.rb automatically logs to log/development.log. Simply tail the log file:
tail -f log/development.log
To enable debug level logging with output to stdout:
DSPy.configure do |config|
  config.logger = Dry.Logger(:dspy, formatter: :string) do |s|
    s.add_backend(level: :debug, stream: $stdout)
  end
end
Or redirect logs to stdout using the environment variable:
DSPY_LOG=/dev/stdout ruby your_script.rb
- Check module configuration:
predictor = DSPy::Predict.new(MySignature)
puts predictor.lm # Should not be nil
puts predictor.config.inspect
- Verify API connectivity:
lm = DSPy::LM.new("openai/gpt-4")
response = lm.generate("Test prompt")
puts response
- Use JSON logging for production:
DSPy.configure do |config|
  config.logger = Dry.Logger(:dspy, formatter: :json) do |s|
    s.add_backend(stream: $stdout)
  end
end
Getting Help
If you encounter issues not covered here:
- Check the GitHub issues
- Search the documentation
- Create a new issue with:
- Ruby version
- DSPy version
- Minimal reproduction code
- Full error message and stack trace