feat: Add Spark wallet support with Breez Nodeless SDK #841

Open

aljazceru wants to merge 1 commit into cashubtc:main from nostr-net:breez-spark-backend

Conversation

@aljazceru

  • Adds SparkWallet backend
  • Fixes docker-compose.yaml file to only default to fakewallet if the env variables are not present

@callebtc
Collaborator

nice one!

@callebtc added the needs review, enhancement (New feature or request), mint (About the Nutshell mint), and lightning (Lightning network) labels on Nov 30, 2025
assert settings.mint_spark_api_key, "MINT_SPARK_API_KEY not set"
assert settings.mint_spark_mnemonic, "MINT_SPARK_MNEMONIC not set"

network_name = getattr(settings, "mint_spark_network", "mainnet").lower()
@TheRealCheebs (Contributor) Dec 5, 2025
You have these defaults already defined in settings.py:

    mint_spark_network: str = Field(default="mainnet")
    mint_spark_storage_dir: str = Field(default="data/spark")
    mint_spark_connection_timeout: int = Field(default=30)
    mint_spark_retry_attempts: int = Field(default=3)

We should manage them in one place.

@aljazceru (Author)

ah yeah, good catch, i'll remove the duplicates

@aljazceru (Author)

@TheRealCheebs I've finally looked into this properly. Technically it's following the pattern from other backends, which have their settings defined in settings.py but validate the required credentials in their respective backend files:

macaroon = settings.mint_corelightning_rest_macaroon

if not endpoint:

@TheRealCheebs (Contributor)

I think those two show the pattern you have for mint_spark_api_key and mint_spark_mnemonic: raising an exception or asserting when nothing is set. What I'm calling out is that you set defaults in settings.py for mint_spark_network, mint_spark_storage_dir, ..., and then set defaults again in the third parameter of getattr. The getattr defaults should be dropped as duplicates, so that defaults are managed in one place.
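A minimal sketch of the change being suggested (names taken from the diff; the dataclass is a stand-in for the real pydantic settings object, whose defaults live in settings.py as Field(default=...)):

```python
from dataclasses import dataclass

# Stand-in for the settings.py object; in Nutshell these defaults
# are declared once via pydantic Field(default=...).
@dataclass
class Settings:
    mint_spark_network: str = "mainnet"
    mint_spark_storage_dir: str = "data/spark"

settings = Settings()

# Before: the default "mainnet" is duplicated at the call site.
network_name = getattr(settings, "mint_spark_network", "mainnet").lower()

# After: the attribute always exists on the settings object, so read it
# directly and let settings.py remain the single source of defaults.
network_name = settings.mint_spark_network.lower()
```

With the direct attribute access, changing the default in settings.py takes effect everywhere, with no risk of a stale copy in a getattr call.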

Comment on lines +12 to +15
- MINT_BACKEND_BOLT11_SAT=${MINT_BACKEND_BOLT11_SAT:-FakeWallet}
- MINT_LISTEN_HOST=${MINT_LISTEN_HOST:-0.0.0.0}
- MINT_LISTEN_PORT=${MINT_LISTEN_PORT:-3338}
- MINT_PRIVATE_KEY=${MINT_PRIVATE_KEY:-TEST_PRIVATE_KEY}
Collaborator

unrelated changes?

@aljazceru (Author)

yeah, kinda. It makes the docker file actually work properly (vars taken from .env with sane defaults); otherwise you either need a docker-compose.override.yaml or you always have a diff of the file in the repo. So it's an improvement, but it should possibly be in a separate PR :)
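For readers unfamiliar with the syntax in the diff above, a quick sketch of how `${VAR:-default}` behaves (port number taken from the diff; any POSIX shell resolves it the same way docker compose does):

```shell
# docker compose interpolates ${VAR:-default} like POSIX parameter
# expansion: use the environment (or .env) value if set, else the fallback.
unset MINT_LISTEN_PORT
echo "port=${MINT_LISTEN_PORT:-3338}"    # unset -> prints port=3338

export MINT_LISTEN_PORT=9000
echo "port=${MINT_LISTEN_PORT:-3338}"    # set -> prints port=9000
```

This is what lets the compose file run out of the box while still honoring values from a .env file, without a docker-compose.override.yaml.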

COPY . .
RUN poetry config virtualenvs.create false
RUN poetry install --no-dev --no-root

Collaborator

?
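If the question is about the Poetry flags (an assumption on my part, not stated in the thread): `--no-dev` was deprecated in Poetry 1.2 and removed in 2.0 in favor of dependency groups, so on a newer Poetry the equivalent lines would be a sketch like:

```dockerfile
COPY . .
RUN poetry config virtualenvs.create false
# Poetry >= 1.2: dependency groups replace the deprecated --no-dev flag
RUN poetry install --without dev --no-root
```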

@ye0man ye0man added this to nutshell Jan 21, 2026
@github-project-automation github-project-automation bot moved this to Backlog in nutshell Jan 21, 2026

Labels

enhancement (New feature or request), lightning (Lightning network), mint (About the Nutshell mint), needs review

Projects

Status: Backlog

5 participants