From 6bcb3d68b0def521679dedf369b134a3a85cc8b0 Mon Sep 17 00:00:00 2001
From: lisancao
Date: Mon, 27 Apr 2026 14:12:42 -0700
Subject: [PATCH] Update redirected links in contributing.md to canonical URLs
Five external links in contributing.md (and the rendered site/contributing.html)
currently reach their targets only by following a redirect:
- help.github.com (deprecated) to docs.github.com (2 occurrences)
- oracle.com/technetwork (deprecated) to oracle.com/java/technologies (1 occurrence)
- http://legacy.python.org/dev/peps to https://peps.python.org (1 occurrence)
- http://docs.scala-lang.org to https://docs.scala-lang.org (1 occurrence)
Each is updated to its current canonical URL.
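The substitutions above can be sketched as a small prefix-rewrite helper. This is an illustrative sketch only: the prefix pairs are taken from the summary above, and the helper name `canonicalize` is hypothetical, not part of this patch. The Oracle link is omitted from the map because its old and new paths are page-specific rather than a simple prefix swap.

```python
# Sketch: rewrite deprecated link prefixes to their canonical equivalents,
# mirroring the domain substitutions described in this patch's summary.
# (Illustrative only; the real edits were made by hand in the files below.)
REDIRECT_PREFIXES = {
    "https://help.github.com/": "https://docs.github.com/",
    "http://legacy.python.org/dev/peps": "https://peps.python.org",
    "http://docs.scala-lang.org": "https://docs.scala-lang.org",
    # oracle.com/technetwork -> oracle.com/java/technologies is a
    # page-specific move, so it is not expressible as a prefix pair here.
}

def canonicalize(url: str) -> str:
    """Return the canonical form of a known-deprecated URL, else unchanged."""
    for old, new in REDIRECT_PREFIXES.items():
        if url.startswith(old):
            return new + url[len(old):]
    return url
```

A quick spot-check with `curl -sIL <url>` on each rewritten link confirms it now answers 200 directly, with no intermediate 301.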
Generated-by: Claude Opus 4.7 (1M context) (claude-opus-4-7)
---
contributing.md | 10 +++++-----
site/contributing.html | 10 +++++-----
2 files changed, 10 insertions(+), 10 deletions(-)
diff --git a/contributing.md b/contributing.md
index 1d68b76959..a82b1952a5 100644
--- a/contributing.md
+++ b/contributing.md
@@ -359,7 +359,7 @@ and every run burdens the limited resources of GitHub Actions in Apache Spark re
Below steps will take your through the process.
-1. Fork the GitHub repository at
+1. Fork the GitHub repository at
https://github.com/apache/spark if you haven't already
1. Go to "Actions" tab on your forked repository and enable "Build and test" and "Report test results" workflows
1. Clone your fork and create a new branch
@@ -400,7 +400,7 @@ passes style checks.
If style checks fail, review the Code Style Guide below.
1. Push commits to your branch. This will trigger "Build and test" and "Report test results" workflows
on your forked repository and start testing and validating your changes.
-1. Open a pull request against
+1. Open a pull request against
the `master` branch of `apache/spark`. (Only in special cases would the PR be opened against other branches). This
will trigger workflows "On pull request*" (on Spark repo) that will look/watch for successful workflow runs on "your" forked repository (it will wait if one is running).
1. The PR title should be of the form `[SPARK-xxxx][COMPONENT] Title`, where `SPARK-xxxx` is
@@ -473,17 +473,17 @@ resolve the JIRA.
Please follow the style of the existing codebase.
- For Python code, Apache Spark follows
-PEP 8 with one exception:
+PEP 8 with one exception:
lines can be up to 100 characters in length, not 79.
- For R code, Apache Spark follows
Google's R Style Guide with three exceptions:
lines can be up to 100 characters in length, not 80, there is no limit on function name but it has a initial
lower case letter and S4 objects/methods are allowed.
- For Java code, Apache Spark follows
-Oracle's Java code conventions and
+Oracle's Java code conventions and
Scala guidelines below. The latter is preferred.
- For Scala code, Apache Spark follows the official
-Scala style guide and
+Scala style guide and
Databricks Scala guide. The latter is preferred. To format Scala code, run ./dev/scalafmt prior to submitting a PR.
If in doubt
diff --git a/site/contributing.html b/site/contributing.html
index f5c4555006..4d53d54dbd 100644
--- a/site/contributing.html
+++ b/site/contributing.html
@@ -551,7 +551,7 @@ Pull request
Below steps will take your through the process.
- - Fork the GitHub repository at
+ - Fork the GitHub repository at
https://github.com/apache/spark if you haven’t already
- Go to “Actions” tab on your forked repository and enable “Build and test” and “Report test results” workflows
- Clone your fork and create a new branch
@@ -598,7 +598,7 @@ Pull request
If style checks fail, review the Code Style Guide below.
- Push commits to your branch. This will trigger “Build and test” and “Report test results” workflows
on your forked repository and start testing and validating your changes.
- - Open a pull request against
+ - Open a pull request against
the
master branch of apache/spark. (Only in special cases would the PR be opened against other branches). This
will trigger workflows “On pull request*” (on Spark repo) that will look/watch for successful workflow runs on “your” forked repository (it will wait if one is running).
@@ -686,17 +686,17 @@ Code style guide
- For Python code, Apache Spark follows
-PEP 8 with one exception:
+PEP 8 with one exception:
lines can be up to 100 characters in length, not 79.
- For R code, Apache Spark follows
Google’s R Style Guide with three exceptions:
lines can be up to 100 characters in length, not 80, there is no limit on function name but it has a initial
lower case letter and S4 objects/methods are allowed.
- For Java code, Apache Spark follows
-Oracle’s Java code conventions and
+Oracle’s Java code conventions and
Scala guidelines below. The latter is preferred.
- For Scala code, Apache Spark follows the official
-Scala style guide and
+Scala style guide and
Databricks Scala guide. The latter is preferred. To format Scala code, run ./dev/scalafmt prior to submitting a PR.