feat: add MiniMax as third LLM provider in Chapter 10 #33
Open
octo-patch wants to merge 1 commit into VersusControl:main from
Conversation
Add MiniMax (https://www.minimax.io) as a third LLM provider alongside Google Gemini and GitHub Models in the AI Agent tutorial (Chapter 10).

MiniMax exposes an OpenAI-compatible API, so it reuses ChatOpenAI from langchain-openai with a custom base_url -- the same pattern as GitHub Models. Temperature is clamped to a 0.01 minimum (MiniMax requires > 0).

Changes:
- New MiniMaxModel wrapper (src/models/minimax.py)
- Updated model factory to support LLM_PROVIDER=minimax
- Added MINIMAX_API_KEY/MODEL/ENDPOINT config vars with validation
- Updated .env.example, README, sidebar UI, and chapter tutorial text
- 21 unit tests + 3 integration tests

Co-Authored-By: Octopus <liyuan851277048@icloud.com>
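The wrapper-plus-clamp pattern described above could be sketched roughly as follows. This is a hypothetical, minimal version: only ChatOpenAI, langchain-openai, the 0.01 floor, and the MINIMAX_API_KEY/MINIMAX_ENDPOINT variable names come from the PR text; the function names and the default endpoint URL are placeholders, not the PR's actual code.

```python
import os

# MiniMax rejects temperature <= 0, hence the 0.01 floor mentioned in the PR.
MINIMAX_MIN_TEMPERATURE = 0.01


def clamp_temperature(temperature: float) -> float:
    """Raise any non-positive temperature to the MiniMax minimum."""
    return max(temperature, MINIMAX_MIN_TEMPERATURE)


def create_minimax_chat(model: str = "MiniMax-M2.7", temperature: float = 0.7):
    """Build a MiniMax chat model by pointing ChatOpenAI at a custom base_url.

    The fallback endpoint below is a placeholder; the real value comes from
    the MINIMAX_ENDPOINT config var this PR adds.
    """
    # Imported lazily so the clamp helper is usable without langchain installed.
    from langchain_openai import ChatOpenAI

    return ChatOpenAI(
        model=model,
        api_key=os.environ["MINIMAX_API_KEY"],
        base_url=os.environ.get(
            "MINIMAX_ENDPOINT", "https://your-minimax-endpoint/v1"
        ),
        temperature=clamp_temperature(temperature),
    )
```

Because the API is OpenAI-compatible, no MiniMax-specific client is needed; the clamp is the only provider-specific behavior layered on top of ChatOpenAI.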
Summary
Adds MiniMax as a third LLM provider alongside Google Gemini and GitHub Models in the AI Agent for DevOps tutorial (Chapter 10).
MiniMax exposes an OpenAI-compatible API, so it reuses ChatOpenAI from langchain-openai with a custom base_url — the same pattern already established for GitHub Models. This gives readers another provider option they can switch to with a single LLM_PROVIDER=minimax environment variable.

Changes (10 files, 543 additions)
- src/models/minimax.py — MiniMaxModel wrapper with temperature clamping (MiniMax requires > 0)
- src/models/factory.py — added minimax to provider dispatch
- src/models/__init__.py — exports MiniMaxModel
- src/config.py — added MINIMAX_API_KEY, MINIMAX_MODEL, MINIMAX_ENDPOINT config vars with validation
- .env.example — documented MiniMax env vars
- app.py — sidebar now shows MiniMax provider label
- README.md — added MiniMax to provider table and project structure
- 10-building-complex-agent-with-actions.md — added MiniMax code example, config, and validation in tutorial text
- tests/test_minimax.py — 21 unit tests (model init, temperature clamping, factory, config validation, exports)
- tests/test_minimax_integration.py — 3 integration tests (chat completion, tool binding, factory)

Models supported
- MiniMax-M2.7 (default)
- MiniMax-M2.7-highspeed

Test plan
- python -m pytest tests/test_minimax.py
- LLM_PROVIDER=minimax works with streamlit run app.py
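In the spirit of the factory and config-validation tests listed above, the dispatch and validation could look something like this. All function names, the placeholder builders, and the error messages are assumptions for illustration, not code taken from the PR's diff.

```python
# Placeholder builders standing in for the real model wrapper classes.
def _build_gemini():
    return "GeminiModel"


def _build_github():
    return "GitHubModel"


def _build_minimax():
    return "MiniMaxModel"


_PROVIDERS = {
    "gemini": _build_gemini,
    "github": _build_github,
    "minimax": _build_minimax,
}


def create_model(llm_provider: str):
    """Dispatch on the LLM_PROVIDER value, failing loudly on unknown names."""
    try:
        builder = _PROVIDERS[llm_provider.strip().lower()]
    except KeyError:
        raise ValueError(
            f"Unsupported LLM_PROVIDER {llm_provider!r}; "
            f"expected one of {sorted(_PROVIDERS)}"
        ) from None
    return builder()


def validate_minimax_config(env: dict) -> None:
    """Reject a minimax configuration that lacks its API key."""
    if env.get("LLM_PROVIDER", "").strip().lower() == "minimax":
        if not env.get("MINIMAX_API_KEY"):
            raise ValueError(
                "MINIMAX_API_KEY is required when LLM_PROVIDER=minimax"
            )


def test_factory_dispatches_minimax():
    assert create_model("minimax") == "MiniMaxModel"


def test_unknown_provider_rejected():
    try:
        create_model("unknown")
    except ValueError as exc:
        assert "LLM_PROVIDER" in str(exc)
    else:
        raise AssertionError("expected ValueError")


def test_missing_api_key_rejected():
    try:
        validate_minimax_config({"LLM_PROVIDER": "minimax"})
    except ValueError as exc:
        assert "MINIMAX_API_KEY" in str(exc)
    else:
        raise AssertionError("expected ValueError")
```

Running pytest against a module shaped like this exercises the same three concerns the test plan names: provider dispatch, rejection of unknown providers, and config validation for the missing API key.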