[GraphX] Improve LiveJournalPageRank example #4917

Closed · wants to merge 5 commits

Conversation

@jackylk (Contributor) commented on Mar 5, 2015

  1. Removed an unnecessary import.
  2. Modified the usage message, since the user must specify the --numEPart parameter; it is required by Analytics.main (see the sketch below).
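
For context, the error message quoted later in this thread ("Set the number of edge partitions using --numEPart.") comes from Analytics.main treating the option as required. Below is a minimal, self-contained sketch of that kind of required-option check; the object name and argument parsing are illustrative, not the actual Analytics.scala code.

```scala
// Illustrative sketch of a required-option check that mirrors the message
// Analytics.main prints when --numEPart is missing (object name and parsing
// are hypothetical, not the actual Spark code).
object RequiredOptionSketch {
  def main(args: Array[String]): Unit = {
    // Collect --key=value arguments into a mutable map, e.g. --numEPart=4.
    val options = scala.collection.mutable.Map(args.collect {
      case arg if arg.startsWith("--") && arg.contains("=") =>
        val Array(key, value) = arg.stripPrefix("--").split("=", 2)
        key -> value
    }: _*)

    // --numEPart is required: print the hint and exit if it is absent.
    val numEPart: Int = options.remove("numEPart").map(_.toInt).getOrElse {
      println("Set the number of edge partitions using --numEPart.")
      sys.exit(1)
    }

    println(s"Using $numEPart edge partitions")
  }
}
```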

@SparkQA commented on Mar 5, 2015

Test build #28299 has started for PR 4917 at commit 5caae76.

  • This patch merges cleanly.

@@ -17,11 +17,6 @@

package org.apache.spark.examples.graphx

import org.apache.spark.SparkContext._
Member commented:

Are you sure you can remove this one? It imports implicits, right?
This change also loses the description of the parameter.
This is on the border of non-trivial, but if it's merely fixing the usage note, it seems OK.
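
A hedged aside on the implicits question: as of Spark 1.3 the RDD-to-PairRDDFunctions conversions are also defined on the RDD companion object, so pair-RDD code like the following compiles without `import org.apache.spark.SparkContext._`; on earlier versions the import was needed. The object name and local master below are illustrative, not part of the PR.

```scala
// Minimal sketch, assuming Spark 1.3+: reduceByKey compiles without
// `import org.apache.spark.SparkContext._`, because the implicit conversions
// also live on the RDD companion object.
import org.apache.spark.{SparkConf, SparkContext}

object ImplicitsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ImplicitsSketch").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))
    // reduceByKey is provided by PairRDDFunctions via an implicit conversion.
    val sums = pairs.reduceByKey(_ + _)
    sums.collect().foreach(println)

    sc.stop()
  }
}
```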

@jackylk (author) commented:

Yes, I think it is OK. The parameter description will be printed by Analytics.main; it looks like this:

JackydeMacBook-Pro:spark jackylee$ bin/run-example graphx.LiveJournalPageRank ../data/Wiki-Vote.txt
Set the number of edge partitions using --numEPart.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
JackydeMacBook-Pro:spark jackylee$ 

Member commented:

Yes, but why remove the description, which already existed in the help message in this file?

@jackylk (author) commented:

Yes, you are right. I changed it back and only removed the default-value description.

@SparkQA commented on Mar 5, 2015

Test build #28299 has finished for PR 4917 at commit 5caae76.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented:

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28299/

@SparkQA commented on Mar 6, 2015

Test build #28319 has started for PR 4917 at commit 87be83b.

  • This patch merges cleanly.

@SparkQA commented on Mar 6, 2015

Test build #28319 has finished for PR 4917 at commit 87be83b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented:

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28319/

" [--tol=<tolerance>]\n" +
" The tolerance allowed at convergence (smaller => more accurate). Default is " +
"0.001.\n" +
" [--output=<output_file>]\n" +
" If specified, the file to write the ranks to.\n" +
" [--numEPart=<num_edge_partitions>]\n" +
" The number of partitions for the graph's edge RDD. Default is 4.\n" +
" The number of partitions for the graph's edge RDD.\n" +
Member commented:

Eh, but now you have put the option in twice! Why not just move it up and remove the square brackets to mark it non-optional? Keep everything else the same.
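
A sketch of what this suggestion might look like (illustrative only, not the committed change): --numEPart is listed once, without square brackets, keeping its original description minus the default value.

```scala
object UsageSketch {
  // Illustrative only: --numEPart listed once, without square brackets,
  // to mark it as required; the optional flags keep their brackets.
  val usage: String =
    "  --numEPart=<num_edge_partitions>\n" +
      "    The number of partitions for the graph's edge RDD.\n" +
      "  [--tol=<tolerance>]\n" +
      "    The tolerance allowed at convergence (smaller => more accurate). Default is 0.001.\n" +
      "  [--output=<output_file>]\n" +
      "    If specified, the file to write the ranks to.\n"

  def main(args: Array[String]): Unit = print(usage)
}
```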

@SparkQA commented on Mar 7, 2015

Test build #28369 has started for PR 4917 at commit b6235e6.

  • This patch merges cleanly.

@SparkQA commented on Mar 7, 2015

Test build #28369 has finished for PR 4917 at commit b6235e6.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented:

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28369/

@SparkQA commented on Mar 8, 2015

Test build #28370 has started for PR 4917 at commit c0df8f2.

  • This patch merges cleanly.

@SparkQA commented on Mar 8, 2015

Test build #28370 has finished for PR 4917 at commit c0df8f2.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented:

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28370/

@@ -60,6 +60,9 @@ fi
#
# For backwards-compatibility, we retain the old IPYTHON and IPYTHON_OPTS variables.

export PYSPARK_DRIVER_PYTHON=ipython
Member commented:

OK, but now there is this unrelated change in your PR.

@SparkQA commented on Mar 8, 2015

Test build #28373 has started for PR 4917 at commit 6c07682.

  • This patch merges cleanly.

@SparkQA commented on Mar 8, 2015

Test build #28373 has finished for PR 4917 at commit 6c07682.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented:

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28373/

@asfgit closed this in 55b1b32 on Mar 8, 2015