Commit ec043b8

Merge pull request #66 from astrolabsoftware/packageName
Package name
2 parents 288bf87 + 3c37803 commit ec043b8

55 files changed: +188 / -175 lines. (Large commit: some changed files are hidden by default and not shown below.)

CHANGELOG.md

Lines changed: 9 additions & 0 deletions
@@ -1,3 +1,12 @@
+## 0.1.3
+
+- Add KNN routines ([KNN](https://github.com/astrolabsoftware/spark3D/pull/59), [KNN](https://github.com/astrolabsoftware/spark3D/pull/60), [KNN](https://github.com/astrolabsoftware/spark3D/pull/62))
+- Unify API to load data ([Point3DRDD](https://github.com/astrolabsoftware/spark3D/pull/63), [SphereRDD](https://github.com/astrolabsoftware/spark3D/pull/64))
+- Speed-up cross-match methods by using native Scala methods ([Scala](https://github.com/astrolabsoftware/spark3D/pull/58))
+- Add a new website + spark3D belongs to AstroLab Software ([website](https://astrolabsoftware.github.io/))
+- Update tutorials ([tuto](https://astrolabsoftware.github.io/spark3D/)).
+- Few fixes here and there...
+
 ## 0.1.1
 
 - Add scripts to generate test data ([PR](https://github.com/astrolabsoftware/spark3D/pull/34))

README.md

Lines changed: 5 additions & 0 deletions
@@ -9,6 +9,7 @@
 - [05/2018] **GSoC 2018**: spark3D has been selected to the Google Summer of Code (GSoC) 2018. Congratulation to [@mayurdb](https://github.com/mayurdb) who will work on the project this year!
 - [06/2018] **Release**: version 0.1.0, 0.1.1
 - [07/2018] **New location**: spark3D is an official project of [AstroLab Software](https://astrolabsoftware.github.io/)!
+- [07/2018] **Release**: version 0.1.3
 
 ## Installation and tutorials
 
@@ -21,3 +22,7 @@ See our amazing [website](https://astrolabsoftware.github.io/spark3D/)!
 * Mayur Bhosale (mayurdb31 at gmail.com) -- GSoC 2018.
 
 Contributing to spark3D: see [CONTRIBUTING](https://github.com/astrolabsoftware/spark3D/blob/master/CONTRIBUTING.md).
+
+## Support
+
+<p align="center"><img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/lal_logo.jpg"/> <img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/psud.png"/> <img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/1012px-Centre_national_de_la_recherche_scientifique.svg.png"/></p>

build.sbt

Lines changed: 3 additions & 4 deletions
@@ -19,8 +19,7 @@ import xerial.sbt.Sonatype._
 lazy val root = (project in file(".")).
 settings(
 inThisBuild(List(
-version := "0.1.2"
-// mainClass in Compile := Some("com.sparkfits.examples.OnionSpace")
+version := "0.1.3"
 )),
 // Name of the application
 name := "spark3D",
@@ -36,7 +35,7 @@ lazy val root = (project in file(".")).
 // Do not publish artifact in test
 publishArtifact in Test := false,
 // Exclude runner class for the coverage
-coverageExcludedPackages := "<empty>;com.spark3d.examples*",
+coverageExcludedPackages := "<empty>;com.astrolabsoftware.spark3d.examples*",
 // Excluding Scala library JARs that are included in the binary Scala distribution
 // assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false),
 // Shading to avoid conflicts with pre-installed nom.tam.fits library
@@ -47,7 +46,7 @@ lazy val root = (project in file(".")).
 "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
 "org.apache.spark" %% "spark-sql" % "2.1.0" % "provided",
 // For loading FITS files
-"com.github.JulienPeloton" %% "spark-fits" % "0.4.0",
+"com.github.astrolabsoftware" %% "spark-fits" % "0.4.0",
 // "org.datasyslab" % "geospark" % "1.1.3",
 // Uncomment if you want to trigger visualisation
 // "com.github.haifengl" % "smile-plot" % "1.5.1",

docs/01_installation.md

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ toto:~$ spark-shell --jars $JARS --packages $PACKAGES
 You will be able to import anything from spark3D
 
 ```scala
-scala> import com.spark3d.geometryObjects.Point3D
+scala> import com.astrolabsoftware.spark3d.geometryObjects.Point3D
 scala> // etc...
 ```
 Note that if you make a fat jar (that is building with `sbt assembly` and not `sbt package`), you do not need to specify external dependencies as they are already included in the resulting jar:

docs/02_introduction.md

Lines changed: 6 additions & 6 deletions
@@ -14,7 +14,7 @@ spark3D supports various 3D shapes: points (`Point3D`), spherical shells (`Shell
 ### Point3D
 
 ```scala
-import com.spark3d.geometryObjects.Point3D
+import com.astrolabsoftware.spark3d.geometryObjects.Point3D
 
 // Cartesian coordinates
 val points = new Point3D(x: Double, y: Double, z: Double, isSpherical: Boolean = false)
@@ -26,7 +26,7 @@ val points = new Point3D(r: Double, theta: Double, phi: Double, isSpherical: Boo
 ### Shells and Spheres
 
 ```scala
-import com.spark3d.geometryObjects.ShellEnvelope
+import com.astrolabsoftware.spark3d.geometryObjects.ShellEnvelope
 
 // Shell from 3D coordinates + inner/outer radii
 val shells = new ShellEnvelope(x: Double, y: Double, z: Double, isSpherical: Boolean, innerRadius: Double, outerRadius: Double)
@@ -44,7 +44,7 @@ val spheres = new ShellEnvelope(center: Point3D, isSpherical: Boolean, radius: D
 ### Boxes
 
 ```scala
-import com.spark3d.geometryObjects.BoxEnvelope
+import com.astrolabsoftware.spark3d.geometryObjects.BoxEnvelope
 
 // Box from region defined by three (cartesian) coordinates.
 val boxes = new BoxEnvelope(p1: Point3D, p2: Point3D, p3: Point3D)
@@ -68,7 +68,7 @@ In this tutorial we will review the steps to simply create RDD from 3D data sets
 A point is an object with 3 spatial coordinates. In spark3D, you can choose the coordinate system between cartesian `(x, y, z)` and spherical `(r, theta, phi)`. Let's suppose we have a text file (CSV, JSON, or TXT) whose columns are labeled `x`, `y` and `z`, the cartesian coordinates of points:
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
 
 // We assume filename contains at least 3 columns whose names are `colnames`
 // Order of columns in the file does not matter, as they will be re-aranged
@@ -79,7 +79,7 @@ val pointRDD = new Point3DRDD(spark: SparkSession, filename: String, colnames: S
 With FITS data, with data in the HDU #1, you would just do
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
 
 // We assume hdu#1 of filename contains at least 3 columns whose names are `colnames`
 // Order of columns in the file does not matter, as they will be re-aranged
@@ -96,7 +96,7 @@ A sphere is defined by its center (3 spatial coordinates) plus a radius.
 In spark3D, you can choose the coordinate system of the center between cartesian `(x, y, z)` and spherical `(r, theta, phi)`. Let's suppose we have a text file (CSV, JSON, or TXT) whose columns are labeled `r`, `theta`, `phi`, the spherical coordinates and `radius`:
 
 ```scala
-import com.spark3d.spatial3DRDD.SphereRDD
+import com.astrolabsoftware.spark3d.spatial3DRDD.SphereRDD
 
 // We assume filename contains at least 4 columns whose names are `colnames`.
 // Order of columns in the file does not matter, as they will be re-aranged
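Read together, the renamed geometry constructors look as follows. This is a minimal sketch built only from the signatures quoted in the hunks above; the numeric values are arbitrary placeholders:

```scala
import com.astrolabsoftware.spark3d.geometryObjects.{Point3D, ShellEnvelope}

// A point given in cartesian coordinates (x, y, z), hence isSpherical = false
val p = new Point3D(0.1, 0.2, 0.3, false)

// A sphere of radius 0.5 centred on that point, using the
// ShellEnvelope(center: Point3D, isSpherical: Boolean, radius: Double) constructor
val sphere = new ShellEnvelope(p, false, 0.5)
```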

docs/03_partitioning.md

Lines changed: 4 additions & 4 deletions
@@ -25,8 +25,8 @@ There are currently 2 partitioning implemented in the library:
 In the following example, we load `Point3D` data, and we re-partition it with the onion partitioning
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
-import com.spark3d.utils.GridType
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.utils.GridType
 
 import org.apache.spark.sql.SparkSession
 
@@ -56,8 +56,8 @@ val pointRDD_partitioned = pointRDD.spatialPartitioning(GridType.LINEARONIONGRID
 In the following example, we load `Point3D` data, and we re-partition it with the octree partitioning
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
-import com.spark3d.utils.GridType
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.utils.GridType
 
 import org.apache.spark.sql.SparkSession

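The call being renamed around these hunks is the repartitioning shown (truncated) in the second hunk header, `pointRDD.spatialPartitioning(GridType.LINEARONIONGRID...`. A sketch under stated assumptions: the loading of `pointRDD` is not part of this diff, and whether an explicit partition count follows the grid type is not visible either, so the call is wrapped in a helper rather than reproduced verbatim:

```scala
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.utils.GridType

// `pointRDD` is assumed to be a Point3DRDD loaded as in docs/02_introduction.md;
// its constructor call is truncated in this diff, so only the type is used here.
def onionRepartition(pointRDD: Point3DRDD) =
  pointRDD.spatialPartitioning(GridType.LINEARONIONGRID)
```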
docs/04_query.md

Lines changed: 8 additions & 8 deletions
@@ -14,9 +14,9 @@ The spark3D library contains a number of methods and tools to manipulate 3D RDD.
 A Envelope query takes as input a `RDD[Shape3D]` and an envelope, and returns all objects in the RDD intersecting the envelope (contained in and crossing the envelope):
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
-import com.spark3d.geometryObjects.{Point3D, ShellEnvelope}
-import com.spark3d.spatialOperator.RangeQuery
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.geometryObjects.{Point3D, ShellEnvelope}
+import com.astrolabsoftware.spark3d.spatialOperator.RangeQuery
 
 import org.apache.spark.sql.SparkSession
 
@@ -53,7 +53,7 @@ Envelope = Sphere |Envelope = Box
 A cross-match takes as input two data sets, and return objects matching based on the center distance, or pixel index of objects. Note that performing a cross-match between a data set of N elements and another of M elements is a priori a NxM operation - so it can be very costly! Let's load two `Point3D` data sets:
 
 ```scala
-import com.spark3d.spatial3DRDD.Point3DRDD
+import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
 
 import org.apache.spark.sql.SparkSession
 
@@ -77,8 +77,8 @@ By default, the two sets are partitioned randomly (in the sense points spatially
 In order to decrease the cost of performing the cross-match, you need to partition the two data sets the same way. By doing so, you will cross-match only points belonging to the same partition. For a large number of partitions, you will decrease significantly the cost:
 
 ```scala
-import com.spark3d.utils.GridType
-import com.spark3d.spatialPartitioning.SpatialPartitioner
+import com.astrolabsoftware.spark3d.utils.GridType
+import com.astrolabsoftware.spark3d.spatialPartitioning.SpatialPartitioner
 
 // nPart is the wanted number of partitions. Default is setA_raw partition number.
 // For the spatial partitioning, you can currently choose between LINEARONIONGRID, or OCTREE.
@@ -114,7 +114,7 @@ Currently, we implemented two methods to perform a cross-match:
 Here is an example which returns only elements from B with counterpart in A using distance center:
 
 ```scala
-import com.spark3d.spatialOperator.CenterCrossMatch
+import com.astrolabsoftware.spark3d.spatialOperator.CenterCrossMatch
 
 // Distance threshold for the match
 val epsilon = 0.004
@@ -127,7 +127,7 @@ val xMatchCenter = CenterCrossMatch
 and the same using the Healpix indices:
 
 ```scala
-import com.spark3d.spatialOperator.PixelCrossMatch
+import com.astrolabsoftware.spark3d.spatialOperator.PixelCrossMatch
 
 // Shell resolution for Healpix indexing
 val nside = 512
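The prose quoted in this diff notes that a naive cross-match between N and M elements is an N x M operation. A generic illustration of that cost, deliberately independent of the spark3D API (plain Scala collections and a hypothetical point type):

```scala
// Naive pairwise cross-match: every element of A is compared against every
// element of B, hence N * M distance evaluations -- the cost the docs warn about.
case class P(x: Double, y: Double, z: Double)

def naiveCrossMatch(a: Seq[P], b: Seq[P], epsilon: Double): Seq[(P, P)] =
  for {
    pa <- a
    pb <- b
    d = math.sqrt(math.pow(pa.x - pb.x, 2) + math.pow(pa.y - pb.y, 2) + math.pow(pa.z - pb.z, 2))
    if d <= epsilon
  } yield (pa, pb)
```

Partitioning both sets the same way (as the hunks above do with `GridType` and `SpatialPartitioner`) restricts these comparisons to pairs inside the same partition.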

docs/_pages/home.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ header:
 cta_label: "<i class='fas fa-download'></i> Install Now"
 cta_url: "/docs/installation/"
 caption:
-excerpt: 'Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...<br /> <small><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.1">Latest release v0.1.1</a></small><br /><br /> {::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
+excerpt: 'Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...<br /> <small><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.3">Latest release v0.1.3</a></small><br /><br /> {::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
 feature_row:
 - image_path:
 alt:

examples/jupyter/CrossMatch.ipynb

Lines changed: 7 additions & 7 deletions
@@ -154,7 +154,7 @@
 },
 "outputs": [],
 "source": [
-"import com.spark3d.spatial3DRDD._\n",
+"import com.astrolabsoftware.spark3d.spatial3DRDD._\n",
 "import org.apache.spark.sql.SparkSession\n",
 "val spark = SparkSession.builder().appName(\"Xmatch\").getOrCreate()\n",
 "\n",
@@ -192,8 +192,8 @@
 },
 "outputs": [],
 "source": [
-"import com.spark3d.utils.GridType\n",
-"import com.spark3d.spatialPartitioning.SpatialPartitioner\n",
+"import com.astrolabsoftware.spark3d.utils.GridType\n",
+"import com.astrolabsoftware.spark3d.spatialPartitioning.SpatialPartitioner\n",
 "\n",
 "// As we are in local mode, and the file is very small, the RDD pointRDD has only 1 partition.\n",
 "// For the sake of this example, let's increase the number of partition to 100.\n",
@@ -244,7 +244,7 @@
 }
 ],
 "source": [
-"import com.spark3d.spatialOperator.PixelCrossMatch\n",
+"import com.astrolabsoftware.spark3d.spatialOperator.PixelCrossMatch\n",
 "\n",
 "// Shell resolution\n",
 "val nside = 512\n",
@@ -298,7 +298,7 @@
 }
 ],
 "source": [
-"import com.spark3d.spatialOperator.CenterCrossMatch\n",
+"import com.astrolabsoftware.spark3d.spatialOperator.CenterCrossMatch\n",
 "\n",
 "// Distance threshold for the match\n",
 "val epsilon = 0.004\n",
@@ -336,9 +336,9 @@
 "import javax.swing.JFrame\n",
 "import javax.swing.JPanel\n",
 "\n",
-"import com.spark3d.utils.Utils.sphericalToCartesian\n",
+"import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian\n",
 "import org.apache.spark.rdd.RDD\n",
-"import com.spark3d.geometryObjects._\n",
+"import com.astrolabsoftware.spark3d.geometryObjects._\n",
 "\n",
 "\n",
 "/** Define palette of colors */\n",

examples/jupyter/onion_partitioning.ipynb

Lines changed: 4 additions & 4 deletions
@@ -101,7 +101,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import com.spark3d.spatial3DRDD._\n",
+"import com.astrolabsoftware.spark3d.spatial3DRDD._\n",
 "import org.apache.spark.sql.SparkSession\n",
 "val spark = SparkSession.builder().appName(\"OnionSpace\").getOrCreate()\n",
 "\n",
@@ -137,7 +137,7 @@
 },
 "outputs": [],
 "source": [
-"import com.spark3d.utils.GridType\n",
+"import com.astrolabsoftware.spark3d.utils.GridType\n",
 "\n",
 "// As we are in local mode, and the file is very small, the RDD pointRDD has only 1 partition.\n",
 "// For the sake of this example, let's increase the number of partition to 5.\n",
@@ -201,9 +201,9 @@
 "import javax.swing.JFrame\n",
 "import javax.swing.JPanel\n",
 "\n",
-"import com.spark3d.utils.Utils.sphericalToCartesian\n",
+"import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian\n",
 "import org.apache.spark.rdd.RDD\n",
-"import com.spark3d.geometryObjects._\n",
+"import com.astrolabsoftware.spark3d.geometryObjects._\n",
 "\n",
 "/** \n",
 " * Define palette of colors \n",
