
Create an MCP server utilising OpenGIN's Read API endpoints#94

Merged
zaeema-n merged 16 commits into LDFLK:main from zaeema-n:add_mcp on Mar 23, 2026

Conversation

@zaeema-n
Member

This PR creates an MCP server that utilises OpenGIN's Read API endpoints. Instructions for using it with Claude are included.

This PR closes #91
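
For context, Claude Desktop discovers MCP servers through its `claude_desktop_config.json` file; the `mcpServers` shape below is the documented Claude Desktop format, but the server's launch command, module name, and environment variable are illustrative assumptions, since the PR's actual README instructions aren't reproduced in this thread:

```json
{
  "mcpServers": {
    "opengin": {
      "command": "python",
      "args": ["-m", "mcp.server"],
      "env": {
        "OPENGIN_API_URL": "http://localhost:8080"
      }
    }
  }
}
```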

@zaeema-n requested a review from @ChanukaUOJ on Mar 20, 2026, 08:25
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist [1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a Model Context Protocol (MCP) server designed to facilitate programmatic access for Large Language Models (LLMs) to the OpenGIN Knowledge Graph. The server provides a structured interface, allowing LLMs to perform complex queries on government structures, ministerial appointments, and tabular datasets from Sri Lanka's public records. This significantly enhances the LLM's capability to interact with and extract insights from structured data sources.

Highlights

  • OpenGIN MCP Server Implementation: An MCP server was implemented to act as a bridge between Large Language Models (LLMs) and the OpenGIN Read API, enabling AI assistants to query the OpenGIN Knowledge Graph.
  • New API Client and Tools: A dedicated HTTP client (mcp/client.py) was added to interact with the OpenGIN Read API, along with four core tools: search_entities, get_entity_metadata, get_entity_attribute, and get_entity_relations.
  • Knowledge Graph Schema Exposure: A new resource, opengin://schema, was introduced to expose the complete OpenGIN Knowledge Graph schema, including entity kinds and relationship types, allowing LLMs to dynamically understand the graph structure.
  • LLM Integration Instructions: Detailed instructions and configuration examples were provided in the README.md for integrating the new MCP server with Claude Desktop.
  • Protobuf Decoding Logic: Robust logic was developed within the API client to decode protobuf-wrapped names and attribute values, ensuring proper interpretation of data returned by the OpenGIN API.
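
The protobuf decoding mentioned in the last highlight can be illustrated without the protobuf library. As a hedged sketch (the actual wire format and encoding the OpenGIN API returns aren't shown in this thread), a `google.protobuf.StringValue` wrapper carries its string in field 1, so a base64-encoded wrapper shorter than 128 bytes can be unpacked with the standard library alone:

```python
import base64

def decode_wrapped_string(b64_value: str) -> str:
    """Decode a base64-encoded protobuf StringValue wrapper.

    Illustrative sketch only: assumes field 1, wire type 2 (tag byte 0x0a)
    and a single-byte varint length, i.e. payloads under 128 bytes.
    """
    raw = base64.b64decode(b64_value)
    if not raw:
        return ""
    if raw[0] != 0x0A:  # tag for field 1, length-delimited
        raise ValueError("not a StringValue-style wrapper")
    length = raw[1]  # single-byte varint; longer payloads need full varint decoding
    return raw[2:2 + length].decode("utf-8")

# Example: "Minister of Finance" wrapped as a StringValue, then base64-encoded
wrapped = base64.b64encode(b"\x0a\x13Minister of Finance").decode()
print(decode_wrapped_string(wrapped))  # → Minister of Finance
```

A full implementation would decode the varint length properly and handle the other wrapper types (`Int64Value`, `BoolValue`, etc.); this only shows the general shape of unwrapping.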
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature              | Command             | Description
---------------------|---------------------|------------------------------------------------------------------
Code Review          | /gemini review      | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary     | Provides a summary of the current pull request in its current state.
Comment              | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help                 | /gemini help        | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its suggestions are incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@gemini-code-assist (bot, Contributor) left a comment

Code Review

This pull request introduces a new MCP server to interact with the OpenGIN Read API. The implementation is well-structured, with clear separation of concerns between the API client, server entry point, tools, and resources. The code is generally clean and includes good documentation.

My review focuses on improving robustness and performance. Specifically, I've suggested changes to handle potential configuration errors gracefully, improve HTTP client performance by reusing connections, and refine exception handling to be more specific. I also found a minor discrepancy in a code comment and some dead code that can be removed.

@ChanukaUOJ (Member) left a comment


Initial Commits!

@ChanukaUOJ (Member) left a comment


LGTM!

@zaeema-n merged commit 619e79b into LDFLK:main on Mar 23, 2026
1 check passed


Development

Successfully merging this pull request may close these issues.

Implement an MCP architecture

2 participants