Port details
- spark: Fast big data processing engine
- 3.3.4 devel =2 (3.3.4 is the version of this port present on the latest quarterly branch)
- Maintainer: freebsd@sysctl.cz
- Port Added: 2014-12-20 18:34:31
- Last Update: 2024-11-14 20:58:49
- Commit Hash: 0bb9466
- People watching this port also watch: jdictionary, py311-Automat, py311-python-gdsii, py39-PyOpenGL, p5-Sane
- Also Listed In: java
- License: APACHE20
- WWW:
- http://spark.apache.org/
- Description:
- Apache Spark is a fast and general-purpose cluster computing system. It
provides high-level APIs in Java, Scala and Python, and an optimized engine
that supports general execution graphs. It also supports a rich set of
higher-level tools including Spark SQL for SQL and structured data processing,
MLlib for machine learning, GraphX for graph processing, and Spark Streaming.
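As a quick illustration of the high-level API and Spark SQL mentioned above, here is a minimal PySpark sketch (a sketch only, assuming the pyspark package bundled with this port is importable, for example by running the script through the port's spark-submit; the app name and data are made up for illustration):

    # Minimal illustration of Spark's DataFrame API and Spark SQL (Spark 3.3.x).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("freshports-demo").getOrCreate()

    # Build a tiny in-memory DataFrame instead of reading external data.
    df = spark.createDataFrame(
        [("Alice", 34), ("Bob", 45), ("Carol", 29)],
        ["name", "age"],
    )

    # The same query expressed through the DataFrame API and through SQL.
    df.filter(df.age > 30).show()
    df.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people WHERE age > 30").show()

    spark.stop()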
- Manual pages:
- FreshPorts has no man page information for this port.
- pkg-plist: as obtained via: make generate-plist
- Dependency lines:
- apache-spark>0:devel/spark
- To install the port:
- cd /usr/ports/devel/spark/ && make install clean
- To add the package, run one of these commands:
- pkg install devel/spark
- pkg install apache-spark
NOTE: If this package has multiple flavors (see below), then use one of them instead of the name specified above.
- PKGNAME: apache-spark
- Flavors: there is no flavor information for this port.
- distinfo:
- TIMESTAMP = 1721161093
SHA256 (spark-3.3.4.tgz) = 0b86dd8a61d317523c2e116cd11f65ed6f87ad69bf564819446ed74569c49f4c
SIZE (spark-3.3.4.tgz) = 29214872
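The SHA256 and SIZE values above are what the ports framework verifies the downloaded distfile against. A rough Python sketch of the same check is shown below (assuming the distfile sits in the default DISTDIR, /usr/ports/distfiles; the path may differ on your system):

    # Recompute the distfile checksum and size and compare them to distinfo.
    import hashlib
    import os

    distfile = "/usr/ports/distfiles/spark-3.3.4.tgz"  # assumed default DISTDIR location
    expected_sha256 = "0b86dd8a61d317523c2e116cd11f65ed6f87ad69bf564819446ed74569c49f4c"
    expected_size = 29214872

    digest = hashlib.sha256()
    with open(distfile, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)

    print("size matches:  ", os.path.getsize(distfile) == expected_size)
    print("sha256 matches:", digest.hexdigest() == expected_sha256)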
- Dependencies
- NOTE: FreshPorts displays only information on required and default dependencies. Optional dependencies are not covered.
- Build dependencies:
- libsnappyjava.so : archivers/snappy-java
- bash : shells/bash
- maven>0 : devel/maven
- java : java/openjdk8
- python3.11 : lang/python311
- Test dependencies:
- python3.11 : lang/python311
- Runtime dependencies:
- bash : shells/bash
- java : java/openjdk8
- python3.11 : lang/python311
- There are no ports dependent upon this port
Configuration Options:
- No options to configure
- Options name: devel_spark
- USES: cpe python shebangfix
- FreshPorts was unable to extract/find any pkg message
- Master Sites:
Commit History (may be incomplete; for full details, see links to repositories near top of page)
Commit | Credits | Log message
3.3.4 14 Nov 2024 20:58:49 | Vladimir Druzenko (vvd), Author: Martin Filla
  devel/spark: fixed typo about BROKEN_i386
  PR: 282762
  Fixes: a6250ad195d9 (Marked i386 as broken)
  MFH: 2024Q4
3.3.4 29 Oct 2024 17:56:23 | Vladimir Druzenko (vvd), Author: Martin Filla
  devel/spark: Marked i386 as broken
  Not enough memory on this architecture to build the port.
  PR: 282310
  MFH: 2024Q4
3.3.4 17 Jul 2024 20:13:35 | Vladimir Druzenko (vvd)
  devel/spark: update 3.3.0 → 3.3.4
  Release notes:
  https://spark.apache.org/releases/spark-release-3-3-1.html
  https://spark.apache.org/releases/spark-release-3-3-2.html
  https://spark.apache.org/releases/spark-release-3-3-3.html
  https://spark.apache.org/releases/spark-release-3-3-4.html
  Also fix build - pkg-plist.
  PR: 280325 280333
  Approved by: Martin Filla <freebsd@sysctl.cz> (maintainer)
  MFH: 2024Q3
3.3.0 13 Jun 2024 10:44:56 | Dirk Meyer (dinoex)
  devel/spark: lowering the memory to 1gb for i386 and armv*
  PR: 266902
3.3.0 11 Mar 2024 09:09:45 | Michael Osipov (michaelo)
  */*: properly depend on Maven package
  devel/maven and devel/maven39 do not provide mvn(1) from LOCALBASE; that
  one comes from devel/maven-wrapper instead. Therefore, one should depend
  on the package rather than a non-existing executable/script.
  Approved by: jrm (mentor), otis (mentor), vvd
  Differential Revision: https://reviews.freebsd.org/D44229
3.3.0 29 Sep 2022 13:06:06 | Stefan Eßer (se)
  devel/spark: fix multiple issues
  The devel/spark port had been resurrected by Neel Chauhan based on
  PR 266484, but that PR had become stale and did not follow current
  rules and conventions (and had some issues that needed to be fixed).
  Neel has run out of time fixing the issues and I have taken over and
  finished the patches he had been working on.
  This updated port has been tested with poudriere testport, but I do
  not have a suitable test environment to run functional tests on.
  Please address run-time issues that are detected to both the
  maintainer of the port and to me.
  Approved by: portmgr (blanket)
3.3.0 19 Sep 2022 15:29:28 | Neel Chauhan (nc), Author: Martin Filla
  devel/spark: Revive port
  PR: 266484
2.1.1_2 31 Mar 2022 20:52:40 | Rene Ladan (rene)
  cleanup: Remove expired ports:
  2022-03-31 devel/spark: Depends on expired devel/maven33
  This should fix INDEX again.
  There are PRs to update this port and import hadoop3.
2.1.1_2 31 Mar 2022 20:47:24 | Yuri Victorovich (yuri)
  devel/spark: Release maintainership
2.1.1_2 15 Jan 2022 11:18:44 | Rene Ladan (rene)
  devel/spark: mark for expiration, depends on expired devel/maven33
2.1.1_2 18 Oct 2021 19:25:52 | Stefan Eßer (se)
  devel/spark: Add CPE information
  Approved by: portmgr (blanket)
2.1.1_2 12 Oct 2021 08:39:19 | Daniel Engberg (diizzy)
  devel/spark: Update MASTER_SITES
  Approved by: rene (portmgr blanket), arrowd (mentor)
  Differential Revision: https://reviews.freebsd.org/D32410
2.1.1_2 06 Apr 2021 14:31:07 | Mathieu Arnold (mat)
  Remove # $FreeBSD$ from Makefiles.
2.1.1_2 14 Aug 2020 07:21:25 | yuri
  devel/spark: Remove 2.7 restriction from python
2.1.1_1 23 Feb 2020 15:25:53 | antoine
  Deprecate a few ports
  With hat: portmgr
2.1.1_1 26 Nov 2019 21:46:13 | jkim
  Clean up after java/openjdk6 and java/openjdk6-jre removal
  java/openjdk6 support was removed from Mk/bsd.java.mk (r512662) and
  java/openjdk6 and java/openjdk6-jre were removed from the ports tree
  (r512663). Now this patch completely removes remaining stuff from the
  ports tree.
  PR: 241953 (exp-run)
  Reviewed by: glewis
  Approved by: portmgr (antoine)
  Differential Revision: https://reviews.freebsd.org/D22342
2.1.1_1 25 Jun 2019 09:29:51 | yuri
  devel/spark: Take maintainership.
2.1.1_1 21 Dec 2018 13:28:06 | demon
  Drop maintainership, I do not use Spark anymore.
2.1.1_1 19 Feb 2018 11:10:43 | antoine
  Reduce dependency on the python2 metaport
  PR: 225752
  Submitted by: Yasuhiro KIMURA
2.1.1 04 Jun 2017 10:35:54 | demon
  Update to version 2.1.1.
2.1.0_1 26 Apr 2017 12:10:20 | miwi
  - Fix shebangs
  - Fix plist
  - Bump PORTREVISION
2.1.0 16 Apr 2017 12:00:13 | demon
  bash is needed on build stage too.
2.1.0 27 Mar 2017 13:19:27 | demon
  Update to version 2.1.0.
  Submitted by: Mark Dixon <mnd999@gmail.com>
1.6.1 02 Nov 2016 11:34:21 | demon
  Use {} to denote variable.
  PR: 214005
1.6.1 17 Apr 2016 12:28:10 | demon
  Allow to build hadoop stack with java-1.8, there are several reports
  it is working fine.
1.6.1 10 Apr 2016 13:39:47 | marino
  devel/spark: Avoid building in /root ($HOME)
  Following other maven ports, set user.home to $WRKDIR to prevent writing
  in /root or $HOME during the build, which is a file system violation.
  While here, change the do-install target to respect 80 columns.
  PR: 208666
  Approved by: maintainer (demon@) (notice the 666 suffix?)
1.6.1 01 Apr 2016 14:00:57 | mat
  Remove ${PORTSDIR}/ from dependencies, categories d, e, f, and g.
  With hat: portmgr
  Sponsored by: Absolight
1.6.1 13 Mar 2016 13:11:41 | demon
  Update to version 1.6.1.
1.6.0 30 Jan 2016 21:29:42 | demon
  Update to version 1.6.0.
  Submitted by: Mark Dixon <mnd999@gmail.com>
1.5.2 19 Nov 2015 09:25:05 | demon
  Update to version 1.5.2.
  Ensure that correct version of Java is used to build sources by passing
  JAVA_HOME to MAKE_ENV.
  Submitted by: Mark Dixon <mnd999@gmail.com>
1.5.1 06 Nov 2015 15:14:47 | demon
  Update to version 1.5.1.
  Submitted by: Mark Dixon <mnd999@gmail.com>
1.2.1 11 May 2015 18:34:58 | mat
  Cleanup DIST* variables.
  When appropriate:
  - Try to use DISTVERSION{SUF,PRE}FIX
  - Replace PORTNAME-PORTVERSION by DISTNAME
  - Convert MASTER_SITES to use macros
  - Other light cleanup
  With hat: portmgr
  Sponsored by: Absolight
1.2.1 15 Feb 2015 19:58:54 | demon
  Update to version 1.2.1.
1.2.0 20 Dec 2014 18:26:32 | demon
  New port: Apache Spark.
  Apache Spark is a fast and general-purpose cluster computing system. It
  provides high-level APIs in Java, Scala and Python, and an optimized engine
  that supports general execution graphs. It also supports a rich set of
  higher-level tools including Spark SQL for SQL and structured data processing,
  MLlib for machine learning, GraphX for graph processing, and Spark Streaming.
  WWW: http://spark.apache.org/