A Model Context Protocol (MCP) server that allows MCP clients (like Cursor) to execute Mathematica code via wolframscript and verify mathematical derivations.
Join us in accelerating scientific discovery with AI and open-source tools!
Running any server in this repository is as simple as a single command. For example, to start the web-fetch server:

```shell
uvx mcp-science web-fetch
```
This command handles everything from installation to execution. For more details on configuration and finding other servers, see the "How to configure MCP servers for AI client apps" section below.
This repository contains a collection of open source MCP servers specifically designed for scientific research applications. These servers enable AI models (like Claude) to interact with scientific data, tools, and resources through a standardized protocol.
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides:
- A growing list of pre-built integrations that your LLM can directly plug into
- The flexibility to switch between LLM providers and vendors
- Best practices for securing your data within your infrastructure
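Under the hood, MCP messages are JSON-RPC 2.0 objects exchanged over a transport such as stdio. As a rough, stdlib-only sketch (the field values below are illustrative, not a complete implementation of the spec), the first message a client sends looks something like this:

```python
import json

def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request of the kind MCP clients send."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# An abridged initialize request -- the handshake that starts every session.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # illustrative version string
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
    "capabilities": {},
})

print(json.loads(init)["method"])  # prints: initialize
```

The server answers with a matching response object, after which the client can list and call the server's tools.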
Below is a complete list of the MCP servers that live in this monorepo. Every entry links to the sub-directory that contains the server’s source code and README so that you can find documentation and usage instructions quickly.
An example MCP server that demonstrates the minimal pieces required for a server implementation.
A specialised MCP server that enables AI assistants to search, visualise and manipulate materials-science data from the Materials Project database. A Materials Project API key is required.
Runs Python code snippets in a secure, sandboxed environment with restricted standard-library access so that assistants can carry out analysis and computation without risking your system.
Allows an assistant to run pre-validated commands on remote machines over SSH with configurable authentication and command whitelists.
Fetches and processes HTML, PDF and plain-text content from the Web so that the assistant can quote or summarise it.
Performs Web, academic and “best effort” searches via the TXYZ API. A TXYZ API key is required.
Provides density-functional-theory (DFT) calculations through the GPAW package.
Lets an assistant interact with a running Jupyter kernel, executing notebook cells programmatically.
Evaluates small snippets of Wolfram Language code through a headless Mathematica instance.
Neuroscience Model Analysis Dashboard server that exposes tools for inspecting NEMAD data-sets.
Provides CRUD access to a lightweight JSON database backed by TinyDB so that an assistant can store and retrieve small pieces of structured data.
If you're not familiar with any of this, here is a step-by-step guide for you: Step-by-step guide to configure MCP servers for AI client apps
- uv — a super-fast (Rust-powered) drop-in replacement for pip + virtualenv. Install it with:

```shell
curl -sSf https://astral.sh/uv/install.sh | bash
```

- An MCP-enabled client application such as Claude Desktop, VSCode, Goose, or 5ire.
- uvx (installed alongside uv).
Any server in this repository can be launched with a single shell command. The pattern is:
```shell
uvx mcp-science <server-name>
```
For example, to start the web-fetch stdio server locally, configure the following command in your client:

```shell
uvx mcp-science web-fetch
```
This corresponds to the following entry in Claude Desktop's JSON configuration:
```json
{
  "mcpServers": {
    "web-fetch": {
      "command": "uvx",
      "args": [
        "mcp-science",
        "web-fetch"
      ]
    }
  }
}
```
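If you script your client setup, the same entry can be generated programmatically. A small sketch (the helper name add_server is made up for illustration; only the resulting JSON shape comes from the config above):

```python
import json

def add_server(config: dict, name: str, package: str = "mcp-science") -> dict:
    """Add a uvx-launched server entry to an mcpServers-style config dict."""
    servers = config.setdefault("mcpServers", {})
    servers[name] = {"command": "uvx", "args": [package, name]}
    return config

cfg = add_server({}, "web-fetch")
print(json.dumps(cfg, indent=2))
```

Writing the result back to the client's config file (and restarting the client) has the same effect as editing the JSON by hand.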
The command will download the mcp-science package from PyPI and run the requested entry point.
Have a look at the Available servers list — every entry in the table works with the pattern shown above.
MCPM is a convenience command-line tool that can automate the process of wiring servers into supported clients. It is not required but can be useful if you frequently switch between clients or maintain a large number of servers.
The basic workflow is:
```shell
# Install mcpm first – it is a separate project
uv pip install mcpm

mcpm client ls          # discover supported clients
mcpm client set <name>  # pick the one you are using

# Add a server (automatically installing it if needed)
mcpm add web-fetch
```
After the command finishes, restart your client so that it reloads its tool configuration. You can browse the MCPM Registry for additional community-maintained servers.
Please check How to build your own MCP server step by step for more details.
We enthusiastically welcome contributions to MCP.science! You can help with improving the existing servers, adding new servers, or anything that you think will make this project better.
If you are not familiar with GitHub and how to contribute to an open-source repository, it might be a bit challenging at first, but it is still very manageable. We recommend reading these first:
In short, you can follow these steps:
Fork the repository to your own GitHub account
Clone the forked repository to your local machine
Create a feature branch (git checkout -b feature/amazing-feature)
Make your changes and commit them (git commit -m 'Add amazing feature')
Please create your new server in the servers folder.
To create a new server folder under the repository folder, you can simply run (replace your-new-server with your server name):
```shell
uv init --package --no-workspace servers/your-new-server
uv add --directory servers/your-new-server mcp
```
This will create a new server folder with the necessary files:
```
servers/your-new-server/
├── README.md
├── pyproject.toml
└── src
    └── your_new_server
        └── __init__.py
```
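Conceptually, the stdio server you are about to write reads JSON-RPC messages on stdin and answers on stdout; the real work is handled by the mcp package, but a toy, stdlib-only dispatch loop (method names and framing simplified for illustration) conveys the shape:

```python
import json
import sys

def handle(message: dict) -> dict:
    """Dispatch a single JSON-RPC request to a handler and build a response."""
    if message.get("method") == "ping":
        result = "pong"
    else:
        result = f"unknown method: {message.get('method')}"
    return {"jsonrpc": "2.0", "id": message.get("id"), "result": result}

def serve(stream_in=sys.stdin, stream_out=sys.stdout):
    """Read one JSON message per line and write each reply to stdout."""
    for line in stream_in:
        if not line.strip():
            continue
        reply = handle(json.loads(line))
        stream_out.write(json.dumps(reply) + "\n")

if __name__ == "__main__":
    serve()
```

In a real server you would instead use the abstractions provided by the mcp dependency added above, which take care of the protocol details for you.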
Note that two related names appear in the config files: the package name in pyproject.toml uses hyphens, e.g. your-new-server, while the module directory under src/ uses underscores, e.g. your_new_server.

Push to the branch (git push origin feature/amazing-feature)
Open a Pull Request
Please make sure your PR adheres to:
If you want to recognize contributors for a specific server/subrepo (e.g. servers/gpaw-computation/), you can use the All Contributors CLI in that subdirectory.
Steps:
1. In the server's subdirectory (e.g. servers/gpaw-computation/), create a .all-contributorsrc file (see example).
2. Add a contributor: npx all-contributors add <github-username> <contribution-type>
3. Regenerate the contributors table in README.md: npx all-contributors generate
4. Commit the updated README.md and .all-contributorsrc.

For more details, see the All Contributors CLI installation guide.
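As an illustration, a minimal .all-contributorsrc might look like the following (the field values are placeholders, and the schema shown is abridged; consult the All Contributors documentation for the full set of options):

```json
{
  "projectName": "<server-name>",
  "projectOwner": "<github-org-or-user>",
  "files": ["README.md"],
  "contributors": []
}
```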
This project is licensed under the MIT License - see the LICENSE file for details.
Thanks to all contributors!
For general use, please cite this repository as described in the root CITATION.cff.
If you use a specific server/subproject, please see the corresponding CITATION.cff file in that subproject's folder under servers/ for the appropriate citation.
by: GLips
Give your coding agent direct access to Figma file data, helping it one-shot design implementation.
by: kapilduraphe
Interact with the Webflow APIs.
by: Laksh-star
This MCP server integrates with The Movie Database (TMDB) API to provide movie information, search capabilities, and recommendations.
by: r-huijts
Access Dutch Railways (NS) real-time train travel information and disruptions through the official NS API.