A Model Context Protocol (MCP) server that provides LLM applications with access to Twitter data through the TwitterAPI.io service. This server enables AI assistants like Claude to retrieve and analyze tweets, user profiles, and other Twitter data in a structured way.
- Tweet data by ID (`tweet://{tweet_id}`)
- Tweet replies (`tweet://{tweet_id}/replies`)
- Tweet retweeters (`tweet://{tweet_id}/retweeters`)
- User profiles (`user://{username}`)
- User tweets (`user://{username}/tweets`)
- User followers (`user://{username}/followers`)
- User following (`user://{username}/following`)
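The resource URIs above follow a small set of patterns. As an illustration only, here is a minimal sketch of how such URIs could be matched and their parameters extracted; the `ROUTES` table and `resolve` helper are hypothetical, and the real server relies on the MCP SDK's own URI-template handling:

```python
import re

# Hypothetical routing table mirroring the resource URIs listed above.
ROUTES = [
    (re.compile(r"^tweet://(?P<tweet_id>\d+)$"), "tweet"),
    (re.compile(r"^tweet://(?P<tweet_id>\d+)/replies$"), "tweet_replies"),
    (re.compile(r"^tweet://(?P<tweet_id>\d+)/retweeters$"), "tweet_retweeters"),
    (re.compile(r"^user://(?P<username>\w+)$"), "user_profile"),
    (re.compile(r"^user://(?P<username>\w+)/tweets$"), "user_tweets"),
    (re.compile(r"^user://(?P<username>\w+)/followers$"), "user_followers"),
    (re.compile(r"^user://(?P<username>\w+)/following$"), "user_following"),
]

def resolve(uri: str):
    """Return (resource_name, params) for a matching URI, or None."""
    for pattern, name in ROUTES:
        m = pattern.match(uri)
        if m:
            return name, m.groupdict()
    return None

print(resolve("tweet://1234567890/replies"))  # ('tweet_replies', {'tweet_id': '1234567890'})
print(resolve("user://jack"))                 # ('user_profile', {'username': 'jack'})
```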
- Basic Twitter operations (get tweet, get user profile, search tweets)
- Python 3.8 or higher
- TwitterAPI.io API key
```bash
# Install with pip
pip install twitterapi-mcp

# or with uv for better performance
uv pip install twitterapi-mcp
```
You can run the server directly through `uv run` by adding the following to your `.mcp.json` file:

```json
"twitterapi-mcp": {
  "command": "uv",
  "args": [
    "run",
    "twitterapi-mcp"
  ],
  "env": {
    "TWITTER_API_KEY": "your_api_key_here"
  }
}
```
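If you generate this configuration programmatically, the entry is plain JSON. The helper below is a hypothetical sketch that reproduces the snippet above; the key names follow that snippet, not an official schema:

```python
import json

# Hypothetical helper that emits the server entry shown above.
def mcp_server_entry(api_key: str) -> dict:
    return {
        "twitterapi-mcp": {
            "command": "uv",
            "args": ["run", "twitterapi-mcp"],
            "env": {"TWITTER_API_KEY": api_key},
        }
    }

entry = mcp_server_entry("your_api_key_here")
print(json.dumps(entry, indent=2))
```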
- Clone this repository:

```bash
git clone https://github.com/DevRico003/twitterapi.io-mcp.git
cd twitterapi.io-mcp
```

- Install as a development package:

```bash
pip install -e .
```

- Configure your TwitterAPI.io API key using environment variables or a `.env` file:

```
TWITTER_API_KEY=your_api_key_here
LOG_LEVEL=INFO
CACHE_TTL=3600
MAX_TWEETS=100
```
Run directly with Python:

```bash
# Run the server package as a module
python -m twitterapi_server
```

Or use the MCP development mode:

```bash
mcp dev twitterapi_server
```
Install the server in Claude Desktop with the MCP CLI:

```bash
# If you have the package installed:
mcp install -m twitterapi-mcp --name "Twitter API"

# Or directly from the code:
mcp install twitterapi_server --name "Twitter API"
```
The server supports the following environment variables:

- `TWITTER_API_KEY` (required): Your TwitterAPI.io API key
- `LOG_LEVEL` (optional): Logging level (default: `INFO`)
- `CACHE_TTL` (optional): Cache timeout in seconds (default: 3600, i.e. 1 hour)
- `MAX_TWEETS` (optional): Maximum tweets per request (default: 100)
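To show how these variables fit together, here is a hedged sketch of a configuration loader with the defaults listed above. The `load_config` function is hypothetical; the server's actual config module (`twitterapi/config.py`) may use different names and behavior:

```python
import os

# Hypothetical config loader applying the documented defaults.
def load_config(env=None) -> dict:
    env = os.environ if env is None else env
    api_key = env.get("TWITTER_API_KEY")
    if not api_key:
        raise RuntimeError("TWITTER_API_KEY is required")
    return {
        "api_key": api_key,
        "log_level": env.get("LOG_LEVEL", "INFO"),
        "cache_ttl": int(env.get("CACHE_TTL", "3600")),
        "max_tweets": int(env.get("MAX_TWEETS", "100")),
    }

cfg = load_config({"TWITTER_API_KEY": "test-key"})
print(cfg["log_level"], cfg["cache_ttl"], cfg["max_tweets"])  # INFO 3600 100
```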
```
.
├── .gitignore
├── LICENSE
├── MANIFEST.in
├── README.md            # This file
├── pyproject.toml       # Project configuration and dependencies
├── requirements.txt     # Dependencies (alternative format)
├── setup.py             # Build script (legacy compatibility)
├── twitterapi/          # Main package source code
│   ├── __init__.py
│   ├── api_client.py
│   ├── config.py
│   ├── mcp_server.py
│   ├── utils.py
│   ├── resources/
│   │   ├── __init__.py
│   │   ├── tweet_resources.py
│   │   └── user_resources.py
│   └── tools/
│       ├── __init__.py
│       └── basic_tools.py
└── twitterapi_server/   # Server entry point package
    └── __init__.py      # Contains main() function
```
Run the tests with pytest:

```bash
python -m pytest
```
## 📦 Publishing to PyPI (Developer Notes)
To publish a new version of this package to PyPI:
1. **Increment Version:** Update the `version` number in `pyproject.toml`.
2. **Install Tools:** Make sure you have the necessary tools installed:
```bash
pip install --upgrade build twine
```
3. **Configure Credentials:** Ensure you have a `.pypirc` file in your home directory (`~/.pypirc`) configured with your PyPI API token or username/password. Example:
```ini
[pypi]
username = __token__
password = pypi-your-api-token-here
```
4. **Build the Package:** Generate the distribution archives:
```bash
python -m build
```
This will create `sdist` (.tar.gz) and `wheel` (.whl) files in the `dist/` directory.
5. **Upload to PyPI:** Upload the generated files using twine:
```bash
# Replace X.Y.Z with the new version number
python -m twine upload dist/twitterapi_mcp-X.Y.Z*
# Or upload all files in dist/ (be careful if old versions exist)
# python -m twine upload dist/*
```
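Before uploading, it can help to confirm the version you bumped in step 1 matches the files you are about to upload. The snippet below is a hypothetical sanity check that extracts the `version` field from `pyproject.toml` text with a regex (so it also works on Python versions without `tomllib`):

```python
import re

# Hypothetical pre-upload check: pull the version string out of
# pyproject.toml so the twine glob matches the build you just made.
def read_version(pyproject_text: str) -> str:
    m = re.search(r'^version\s*=\s*"([^"]+)"', pyproject_text, re.MULTILINE)
    if not m:
        raise ValueError("no version field found in pyproject.toml")
    return m.group(1)

sample = 'name = "twitterapi-mcp"\nversion = "1.2.3"\n'
print(read_version(sample))  # 1.2.3
```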
You can run specific test modules:

```bash
python -m pytest tests/test_utils.py
python -m pytest tests/test_api_client.py
```
TwitterAPI.io charges approximately $0.15 per 1,000 tweets retrieved. This server implements caching with a configurable TTL to reduce API costs while maintaining fresh data. The cache is particularly effective for frequently monitored influencers and popular searches.
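The cost-saving idea is that a cached response served within the TTL avoids a billed API call. A minimal TTL-cache sketch, for illustration only (the server's actual caching layer may be implemented differently):

```python
import time

# Minimal time-to-live cache: entries expire TTL seconds after being set.
class TTLCache:
    def __init__(self, ttl: float = 3600.0):
        self.ttl = ttl
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]          # fresh hit: no API call needed
        self._store.pop(key, None)   # drop expired entries lazily
        return None

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)

cache = TTLCache(ttl=2.0)
cache.set("tweet://123", {"text": "hello"})
print(cache.get("tweet://123"))  # {'text': 'hello'}
```

With a 3600-second default TTL, repeated lookups of the same tweet or profile within an hour are served from memory instead of being billed again.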
This project is licensed under the MIT License - see the LICENSE file for details.