[IPython-dev] pixiedebugger: can't install

Lisa Bang lisagbang at gmail.com
Mon Jan 28 10:08:59 EST 2019


Hi Dude,

Yes, it looks like when the installer uses subprocess to ask pyspark for its
version, pyspark exits without reporting one.  I do see that this problem has
happened recently for others <https://github.com/pixiedust/pixiedust/issues/741>
as well.  It could probably be solved by editing the kernel.json for the
pixiedust kernel in your Jupyter directory (typically at
/Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22/kernel.json)
so that it includes pyspark, but that seems like a pain.

For reference, I have pasted the output from my own session below, first
showing the same error and then a successful install afterwards. Ensuring that
the installed py4j and pyspark package versions are compatible with each other
may also help.
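If you do go that route, the idea would be to point the kernel's env at the
Spark install so pyspark can be found. Something along these lines (the paths
and env entries below are illustrative guesses, not copied from a working
pixiedust kernel.json):

```json
{
  "display_name": "Python-with-Pixiedust_Spark-2.2",
  "language": "python",
  "argv": ["python", "-m", "ipykernel", "-f", "{connection_file}"],
  "env": {
    "SPARK_HOME": "/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7",
    "PYTHONPATH": "/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/python"
  }
}
```

You may also need the py4j source zip from $SPARK_HOME/python/lib on
PYTHONPATH; check which one your Spark build ships.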
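The failing call is easy to reproduce outside the installer: createKernel.py
runs subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT),
and when the command exits non-zero the interesting part (the captured output)
never makes it into the traceback. A small wrapper like this sketch (my own
helper, not pixiedust code) keeps it, so you can see why pyspark bailed out:

```python
import subprocess
import sys

def probe_version(cmd):
    """Run `<cmd> --version` the way the pixiedust installer does,
    but return (ok, output) instead of raising on failure."""
    try:
        out = subprocess.check_output(cmd + ["--version"],
                                      stderr=subprocess.STDOUT).decode("utf-8")
        return True, out
    except subprocess.CalledProcessError as exc:
        # exc.output holds whatever the command printed before exiting,
        # e.g. a complaint about a missing Java runtime.
        return False, exc.output.decode("utf-8")

# Example: probe the current Python interpreter instead of pyspark.
ok, out = probe_version([sys.executable])
print(ok, out.strip())
```

Pointing it at .../bin/pyspark should show the same message you would get
from running `pyspark --version` by hand.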

Best,
Lisa


XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*
     Keep y/n [y]? y
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? y
Select an existing spark install or create a new one
*1. spark-2.3.2-bin-hadoop2.7*
*2. Create a new spark Install*
     Enter your selection: 1

Traceback (most recent call last):
  File "/Users/xxxx/anaconda3/bin/jupyter-pixiedust", line 11, in <module>
    sys.exit(main())
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/pixiedustapp.py", line 41, in main
    PixiedustJupyterApp.launch_instance()
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 657, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-2>", line 2, in initialize
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 296, in initialize
    self.parse_command_line(argv)
  File "<decorator-gen-4>", line 2, in parse_command_line
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 514, in parse_command_line
    return self.initialize_subcommand(subc, subargv)
  File "<decorator-gen-3>", line 2, in initialize_subcommand
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 452, in initialize_subcommand
    self.subapp.initialize(argv)
  File "<decorator-gen-6>", line 2, in initialize
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/jupyter_core/application.py", line 238, in initialize
    self.parse_command_line(argv)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py", line 154, in parse_command_line
    spark_version = self.get_spark_version()
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py", line 379, in get_spark_version
    pyspark_out = subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT).decode("utf-8")
  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 389, in check_output
    **kwargs).stdout
  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 481, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/Users/xxxx/pixiedust/bin/spark/spark-2.3.2-bin-hadoop2.7/bin/pyspark', '--version']' returned non-zero exit status 1.

XXXX:Documents xxxx$ pip install pyspark

Collecting pyspark
  Downloading https://files.pythonhosted.org/packages/88/01/a37e827c2d80c6a754e40e99b9826d978b55254cc6c6672b5b08f2e18a7f/pyspark-2.4.0.tar.gz (213.4MB)
    100% |████████████████████████████████| 213.4MB 135kB/s
Collecting py4j==0.10.7 (from pyspark)
  Downloading https://files.pythonhosted.org/packages/e3/53/c737818eb9a7dc32a7cd4f1396e787bd94200c3997c72c1dbe028587bd76/py4j-0.10.7-py2.py3-none-any.whl (197kB)
    100% |████████████████████████████████| 204kB 10.8MB/s
Building wheels for collected packages: pyspark
  Running setup.py bdist_wheel for pyspark ... done
  Stored in directory: /Users/xxxx/Library/Caches/pip/wheels/cd/54/c2/abfcc942eddeaa7101228ebd6127a30dbdf903c72db4235b23
Successfully built pyspark
Installing collected packages: py4j, pyspark
Successfully installed py4j-0.10.7 pyspark-2.4.0

XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*
     Keep y/n [y]? y
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? y
Select an existing spark install or create a new one
*1. spark-2.3.2-bin-hadoop2.7*
*2. Create a new spark Install*
     Enter your selection: 2
*What version would you like to download? 1.6.3, 2.0.2, 2.1.0, 2.2.0, 2.3.2 [2.3.2]: *2.2.0

SPARK_HOME will be set to /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7
Downloading Spark 2.2.0
Extracting Spark 2.2.0 to /Users/xxxx/pixiedust/bin/spark

Traceback (most recent call last):
  File "/Users/xxxx/anaconda3/bin/jupyter-pixiedust", line 11, in <module>
    sys.exit(main())
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/pixiedustapp.py", line 41, in main
    PixiedustJupyterApp.launch_instance()
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 657, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-2>", line 2, in initialize
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 296, in initialize
    self.parse_command_line(argv)
  File "<decorator-gen-4>", line 2, in parse_command_line
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 514, in parse_command_line
    return self.initialize_subcommand(subc, subargv)
  File "<decorator-gen-3>", line 2, in initialize_subcommand
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 452, in initialize_subcommand
    self.subapp.initialize(argv)
  File "<decorator-gen-6>", line 2, in initialize
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/jupyter_core/application.py", line 238, in initialize
    self.parse_command_line(argv)
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py", line 154, in parse_command_line
    spark_version = self.get_spark_version()
  File "/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py", line 379, in get_spark_version
    pyspark_out = subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT).decode("utf-8")
  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 389, in check_output
    **kwargs).stdout
  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 481, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version']' returned non-zero exit status 1.

XXXX:Documents xxxx$ /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version'
> ,
> '
-bash: /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark, --version
,
: No such file or directory

XXXX:Documents xxxx$ /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark
No Java runtime present, requesting install.
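(That "No Java runtime present" line is the actual root cause: Spark's launcher
scripts need a JVM, and without one pyspark exits before it can print a
version. A quick way to check from Python, using a hypothetical helper of my
own rather than anything pixiedust-specific:)

```python
import os
import shutil

def java_available():
    """True if a Java runtime is discoverable on PATH or via JAVA_HOME,
    the two places launcher scripts typically look."""
    if shutil.which("java") is not None:
        return True
    java_home = os.environ.get("JAVA_HOME", "")
    return bool(java_home) and os.path.exists(os.path.join(java_home, "bin", "java"))

print(java_available())
```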

XXXX:Documents xxxx$ conda install pyspark

Solving environment: done

## Package Plan ##

  environment location: /Users/xxxx/anaconda3

  added / updated specs:
    - pyspark

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    py4j-0.10.7                |           py37_0         251 KB
    pyspark-2.4.0              |           py37_0       203.4 MB
    conda-4.6.1                |           py37_0         1.7 MB
    ------------------------------------------------------------
                                           Total:       205.4 MB

The following NEW packages will be INSTALLED:

    py4j:    0.10.7-py37_0
    pyspark: 2.4.0-py37_0

The following packages will be UPDATED:

    conda:   4.5.12-py37_0 --> 4.6.1-py37_0

Proceed ([y]/n)? y

Downloading and Extracting Packages
py4j-0.10.7          | 251 KB    | #################################################################### | 100%
pyspark-2.4.0        | 203.4 MB  | #################################################################### | 100%
conda-4.6.1          | 1.7 MB    | #################################################################### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

(base) XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*
     Keep y/n [y]? y
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? n
*Step 2: Please enter a SPARK_HOME location: * /Users/xxxx/spark-2.3.2-bin-hadoop2.7
*Directory /Users/xxxx/sprk-2.3.2-bin-hadoop2.7 does not exist*
     Create y/n [y]? n
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? n
*Step 2: Please enter a SPARK_HOME location: * /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7
*Directory /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7 does not exist*
     Create y/n [y]? n
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? n
*Step 2: Please enter a SPARK_HOME location: * /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7/bin/pyspark
*Directory /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7/bin/pyspark does not exist*
     Create y/n [y]? n
*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*
     Keep y/n [y]? y
Select an existing spark install or create a new one
*1. spark-2.3.2-bin-hadoop2.7*
*2. spark-2.2.0-bin-hadoop2.7*
*3. Create a new spark Install*
     Enter your selection: 2
downloaded spark cloudant jar: /Users/xxxx/pixiedust/bin/cloudant-spark-v2.0.0-185.jar
*Step 3: SCALA_HOME: /Users/xxxx/pixiedust/bin/scala*
     Keep y/n [y]? y
*Directory /Users/xxxx/pixiedust/bin/scala does not contain a valid scala install*
     Download Scala y/n [y]? y
SCALA_HOME will be set to /Users/xxxx/pixiedust/bin/scala/scala-2.11.8
Downloading Scala 2.11
Extracting Scala 2.11 to /Users/xxxx/pixiedust/bin/scala
*Step 4: Kernel Name: Python-with-Pixiedust_Spark-2.2*
     Keep y/n [y]? y
self.kernelInternalName pythonwithpixiedustspark22
[PixiedustInstall] Installed kernelspec pythonwithpixiedustspark22 in /Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22
Downloading intro notebooks into /Users/xxxx/pixiedust/notebooks
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust 1 - Easy Visualizations.ipynb : *done*
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust 2 - Working with External Data.ipynb : *done*
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust 3 - Scala and Python.ipynb : *done*
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust 4 - Add External Spark Packages.ipynb : *done*
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust 5 - Stash to Cloudant.ipynb : *done*
... https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust Contribute.ipynb : *done*

####################################################################################################
#    Congratulations: Kernel Python-with-Pixiedust_Spark-2.2 was successfully created in /Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22
#    You can start the Notebook server with the following command:
#        *jupyter notebook /Users/xxxx/pixiedust/notebooks*
####################################################################################################


On Sun, Jan 27, 2019, 11:55 PM The Dude <ToTheDude at zoho.com> wrote:

> Hi Lisa,
> thanks for your help.
>
> Unfortunately, I get the same error after installing spark with conda.
> Here’s the output from my terminal:
>
>
> (py37ana) adula-5:~ dude$ jupyter pixiedust install
> Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
> Keep y/n [y]? y
> Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
> Keep y/n [y]? y
> Select an existing spark install or create a new one
> 1. spark-2.2.0-bin-hadoop2.7
> 2. Create a new spark Install
> Enter your selection: 1
> Traceback (most recent call last):
>   File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11,
> in <module>
>     sys.exit(main())
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py",
> line 41, in main
>     PixiedustJupyterApp.launch_instance()
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 657, in launch_instance
>     app.initialize(argv)
>   File "<decorator-gen-2>", line 2, in initialize
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 296, in initialize
>     self.parse_command_line(argv)
>   File "<decorator-gen-4>", line 2, in parse_command_line
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 514, in parse_command_line
>     return self.initialize_subcommand(subc, subargv)
>   File "<decorator-gen-3>", line 2, in initialize_subcommand
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 452, in initialize_subcommand
>     self.subapp.initialize(argv)
>   File "<decorator-gen-6>", line 2, in initialize
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py",
> line 238, in initialize
>     self.parse_command_line(argv)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
> line 154, in parse_command_line
>     spark_version = self.get_spark_version()
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
> line 379, in get_spark_version
>     pyspark_out = subprocess.check_output([pyspark, "--version"],
> stderr=subprocess.STDOUT).decode("utf-8")
>   File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
> line 389, in check_output
>     **kwargs).stdout
>   File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
> line 481, in run
>     output=stdout, stderr=stderr)
> subprocess.CalledProcessError: Command
> '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark',
> '--version']' returned non-zero exit status 1.dude
>
>
>
> It seems that the pyspark executable doesn’t run properly.
>
> TheDude
>
>
>
>
>
> On Jan 27, 2019, at 11:26:25, Lisa Bang <lisagbang at gmail.com> wrote:
>
>
> Hi,
>
> Looks like you don't have pyspark installed.  I reproduced your error and
> after running "conda install pyspark" ,  "jupyter pixiedust install"
> finished just fine.
>
> Best,
> Lisa
>
>
> On Sat, Jan 26, 2019 at 11:22 PM TheDude <ToTheDude at zoho.com> wrote:
>
>> Hello,
>>         I would really like to have the pixiedebugger working in my
>> notebook. Unfortunately I ran into problems with the installation.
>>
>> I am following the instructions here: <
>> https://pixiedust.github.io/pixiedust/install.html>
>> For what is worth, I am using Python 3.7 in Anaconda on macOS 10.12.
>>
>> The first part of the installation:
>>         pip install pixiedust
>>
>> is successful, and it ends with:
>>         Successfully built pixiedust mpld3
>>         Installing collected packages: mpld3, geojson, astunparse,
>> markdown, colour, pixiedust
>>         Successfully installed astunparse-1.6.2 colour-0.1.5
>> geojson-2.4.1 markdown-3.0.1 mpld3-0.3 pixiedust-1.1.15
>>
>> The next step, installing a new Jupyter kernel, fails:
>>         jupyter pixiedust install
>>
>> Here’s what I get:
>>
>> (py37ana) adula-5:~ dude$ jupyter pixiedust install
>> Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
>>         Keep y/n [y]? y
>> Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
>>         Keep y/n [y]? y
>> Select an existing spark install or create a new one
>> 1. spark-2.2.0-bin-hadoop2.7
>> 2. Create a new spark Install
>>         Enter your selection: 1
>> Traceback (most recent call last):
>>  File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11,
>> in <module>
>>    sys.exit(main())
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py",
>> line 41, in main
>>    PixiedustJupyterApp.launch_instance()
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 657, in launch_instance
>>    app.initialize(argv)
>>  File "<decorator-gen-2>", line 2, in initialize
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 296, in initialize
>>    self.parse_command_line(argv)
>>  File "<decorator-gen-4>", line 2, in parse_command_line
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 514, in parse_command_line
>>    return self.initialize_subcommand(subc, subargv)
>>  File "<decorator-gen-3>", line 2, in initialize_subcommand
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 452, in initialize_subcommand
>>    self.subapp.initialize(argv)
>>  File "<decorator-gen-6>", line 2, in initialize
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py",
>> line 238, in initialize
>>    self.parse_command_line(argv)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
>> line 154, in parse_command_line
>>    spark_version = self.get_spark_version()
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
>> line 379, in get_spark_version
>>    pyspark_out = subprocess.check_output([pyspark, "--version"],
>> stderr=subprocess.STDOUT).decode("utf-8")
>>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
>> line 389, in check_output
>>    **kwargs).stdout
>>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
>> line 481, in run
>>    output=stdout, stderr=stderr)
>> subprocess.CalledProcessError: Command
>> '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark',
>> '--version']' returned non-zero exit status 1.
>>
>>
>> Anybody has an idea on how to proceed?
>>
>> TIA.
>>
>> TheDude
>>
>>
>> _______________________________________________
>> IPython-dev mailing list
>> IPython-dev at python.org
>> https://mail.python.org/mailman/listinfo/ipython-dev
>>
>
>