Monday 25 January 2016

CPython Scripting in Pentaho Data Integration

Using the approach developed for integrating Python into Weka, Pentaho Data Integration (PDI) now has a new step that can be used to leverage the Python programming language (and its extensive package-based support for scientific computing) as part of a data integration pipeline. The step has been released to the community from Pentaho Labs and can be installed directly from PDI via the marketplace.


Python is becoming a serious contender to R when it comes to programming language choice for data scientists. In fact, many folks are leveraging the strengths of both languages when developing solutions. With that in mind, it is clear that data scientists and predictive application developers can boost productivity by leveraging the PDI + Python combo. As we all know, data preparation consumes the bulk of time in a typical predictive project. That data prep can typically be achieved more quickly in PDI, compared to developing code from scratch, thanks to its intuitive graphical development environment and extensive library of connectors and processing steps. Instead of having to write (and rewrite) code to connect to source systems (such as relational databases, NoSQL databases, Hadoop filesystems and so forth), and to join/filter/blend data etc., PDI allows the developer to focus their coding efforts on the cool data science-oriented algorithms.

CPython Script Executor

As the name suggests, the new step uses the C implementation of the Python programming language. While there are JVM-based solutions available - such as Jython - that allow a more tightly integrated experience when executing in the JVM, these do not facilitate the use of many high-powered Python libraries for scientific computing, because such libraries include highly optimised components written in C or Fortran. In order to gain access to such libraries, the PDI step launches, and communicates with, a micro-service running in the CPython environment. Communication is done over plain sockets, messages are stored in JSON structures, and datasets are transmitted as CSV, leveraging the pandas package's very fast routines for reading and writing CSV.
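The transport can be pictured with a small sketch; the header fields, function and variable names below are illustrative assumptions, not the step's actual wire protocol:

```python
import io
import json

import pandas as pd


def handle_message(raw_header, raw_csv):
    """Parse a JSON control header, then materialise the CSV payload as a
    pandas DataFrame using pandas' fast C-based CSV reader.
    Illustrative only -- not the step's real protocol."""
    header = json.loads(raw_header)
    frame = pd.read_csv(io.StringIO(raw_csv))
    return header["frame_name"], frame


name, df = handle_message(
    '{"frame_name": "kettle_data", "num_rows": 2}',
    "sepallength,species\n5.1,setosa\n4.9,setosa\n")
```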

The step itself offers maximum flexibility when it comes to dealing with data. It can act as a start point/data source in PDI (thus allowing the developer the freedom to source data directly via their Python code if so desired), or it can accept data from an upstream step and push it into the Python environment. In the latter case, the user can opt to send all incoming rows to Python in one hit, send fixed sized batches of rows, or send rows one-at-a-time. In any of these cases the data sent is considered a dataset, gets stored in a user-specified variable in Python, and the user's Python script is invoked. In the "all data" case, there is also the option to apply reservoir sampling to down-sample to a fixed size before sending the data to Python. The pandas DataFrame is used as the data structure for datasets transferred into Python.
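Reservoir sampling itself is a standard technique; a minimal sketch of the classic Algorithm R (not necessarily the step's exact implementation), which keeps a uniform random sample of fixed size from a stream of rows:

```python
import random


def reservoir_sample(rows, k, rng=random.Random(42)):
    """Keep a uniform random sample of k rows from a stream of unknown length."""
    reservoir = []
    for i, row in enumerate(rows):
        if i < k:
            reservoir.append(row)          # fill the reservoir first
        else:
            j = rng.randint(0, i)          # keep new row with probability k/(i+1)
            if j < k:
                reservoir[j] = row
    return reservoir


sample = reservoir_sample(range(10000), 100)
```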



A Python script can be specified via the built-in editor, or loaded from a file dynamically at runtime. There are two scenarios for getting output from the Python environment to pass on to downstream PDI steps for further processing. The first (primary) scenario is when there is a single variable to retrieve from Python and it is a pandas DataFrame. In this case, the columns of the data frame become output fields from the step. In the second scenario, one or more non-data-frame variables may be specified. In this case, their values are assumed to be textual (or representable as text) or to contain image data (in which case they are retrieved from Python as binary PNG data). Each variable is output in a separate PDI field.
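For the image case, the usual approach is to render a matplotlib figure into an in-memory buffer and take the PNG bytes from there; a minimal sketch (assuming the headless Agg backend, which needs no display):

```python
import io

import matplotlib
matplotlib.use("Agg")                      # headless backend; no display required
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [1, 4, 9])
buf = io.BytesIO()
fig.savefig(buf, format="png")             # binary PNG data the step could ship back
png_bytes = buf.getvalue()
plt.close(fig)
```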


Requirements

The CPython Script Executor step will work with PDI >= 5.0. Of course, it requires Python to be installed and the python executable to be in your PATH environment variable. The step has been tested with Python 2.7 and 3.x and, at a minimum, needs the pandas, matplotlib and numpy libraries to be installed. For Windows users in particular, I'd recommend installing the excellent Anaconda python distribution. This includes the entire SciPy stack (including pandas and scikit-learn) along with lots of other libraries.
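A quick way to confirm that the python on your PATH satisfies these minimum requirements is to probe for the packages before starting PDI:

```python
import importlib.util

# Minimum libraries the step needs, per the requirements above
required = ["pandas", "numpy", "matplotlib"]
missing = [name for name in required
           if importlib.util.find_spec(name) is None]
print("missing packages:", missing if missing else "none")
```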

Example

The example transformation shown in the following screenshot can be obtained from here.

The example uses Fisher's classic iris data. The first Python step (at the top) simply computes some quartiles for the numeric columns in the iris data. This is output from the step as a pandas DataFrame, where each row corresponds to one of the quartiles computed (25th, 50th and 75th), and each column holds the value for one of the numeric fields in the iris data. The second Python step from the top uses the scikit-learn decomposition routine to compute a principal components analysis on the iris data and then transforms the iris data into the PCA space, which is then the output of the step. The third Python step from the top uses the matplotlib library and plotting routines from the pandas library to compute some visualisations of the iris data (scatter plot matrix, Andrews curves, parallel coordinates and RadViz). These are then extracted as binary PNG data from the Python environment and saved to files in the same directory that the transformation was loaded from. The two Python steps at the bottom of the transformation learn a decision tree model and use that model to score the iris data, respectively. The model is saved (from the Python environment) to the directory that the transformation was loaded from.
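The quartile computation from the first step is essentially a one-liner in pandas; a sketch with a tiny stand-in frame (in the real transformation the iris data arrives from the upstream PDI step):

```python
import pandas as pd

# Stand-in for the numeric iris columns received from PDI
df = pd.DataFrame({"sepallength": [4.9, 5.0, 5.1, 5.8, 6.3],
                   "petallength": [1.4, 1.5, 1.6, 4.1, 5.0]})

# One row per quartile (25th, 50th, 75th), one column per numeric field
quartiles = df.quantile([0.25, 0.5, 0.75])
```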


Conclusion

The new PDI CPython Script Executor step opens up the power of Python to the PDI developer and data scientist. It joins the R Script Executor and Weka machine learning steps in PDI as part of an expanding array of advanced statistical and predictive tools that can be leveraged within data integration processes.

229 comments:

  1. Great stuff. Thanks for sharing this!
    Question: Is it possible to pass a JSON string instead of a CSV?

  2. Kettle rows are transferred as CSV and materialised as a pandas data frame on the Python side. If your JSON strings are stored as quoted Kettle fields (escaping is probably necessary for quotes in the JSON itself) then I guess it should be possible to just pull the values out of the rows of the frame on the Python side.

    We could, in a future release, add an option to the row-by-row mode to simply assign the value of each field in the Kettle row to a separate variable in Python.
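    Pulling the JSON values out on the Python side is then a one-liner with json.loads; a sketch with made-up frame and column names for illustration:

```python
import json

import pandas as pd

# Stand-in for the frame PDI would materialise; "payload" holds JSON strings
kettle_data = pd.DataFrame({"payload": ['{"id": 1, "name": "a"}',
                                        '{"id": 2, "name": "b"}']})

parsed = kettle_data["payload"].apply(json.loads)   # one dict per row
ids = [record["id"] for record in parsed]
```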

    Cheers,
    Mark.

  3. I keep getting an error at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.executeScript(CPythonScriptExecutor.java:446)

    Any suggestions ?

    I am using Anaconda python.

  4. I assume that you have the Anaconda bin directory in your PATH? Is Anaconda installed system-wide, or in your home directory? Some people have reported problems with a system-wide installation - something to do with file permissions preventing Anaconda from writing data.

    Is there a stack trace available on the console or in the PDI logs?

    Cheers,
    Mark.

  5. Hi Mark!
    Thanks a lot for wonderful job.
    It is an amazing opportunity to incorporate pure Python scripts inside PDI.
    I use PDI 7.0 Community Edition on OS X and Ubuntu.
    But sometimes I get an error after saving and trying to open the CPythonExecutor step:
    "Unable to open dialog for this step"
    Argument cannot be null
    java.lang.IllegalArgumentException: Argument cannot be null
    at org.eclipse.swt.SWT.error(Unknown Source)
    at org.eclipse.swt.SWT.error(Unknown Source)
    at org.eclipse.swt.SWT.error(Unknown Source)
    at org.eclipse.swt.widgets.Widget.error(Unknown Source)
    at org.eclipse.swt.widgets.Text.setText(Unknown Source)
    at org.pentaho.di.ui.core.widget.TextVar.setText(TextVar.java:210)
    at org.pentaho.di.ui.trans.steps.cpythonscriptexecutor.CPythonScriptExecutorDialog.getData(CPythonScriptExecutorDialog.java:886)
    at org.pentaho.di.ui.trans.steps.cpythonscriptexecutor.CPythonScriptExecutorDialog.open(CPythonScriptExecutorDialog.java:249)
    at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:127)
    at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8789)
    at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3179)
    at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:775)
    at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
    at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
    at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
    at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
    at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1359)
    at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7990)
    at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9290)
    at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:685)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)

    Replies
    1. Konstantin, I ran into that as well. I've submitted a fix to that to the pentaho-labs repo, but the commit that fixed it is here if you need it in the meantime:

      https://github.com/CodeSolid/pentaho-cpython-plugin/commit/0e2e9dc32cbd0759429ab17b45983cb6a57916c8

    2. Hi John, that link is not available. Please let me know how you were able to resolve the issue. Thank you.

  6. Hi
    Thanks for making this available. I'm using the CPython Scripting tab in the Explorer interface. I have loaded a csv file into Explorer and this is transferred as a py_data dataframe into the CPython tab and I can inspect the data. However, if I make changes to the dataframe, adding another column for instance, how do I update the data loaded into Explorer? I have tried to write out a csv file so I can load it in again, but this fails with an error. I wonder if you have an example?

    Thanks again,
    Mark.

  7. Hi Mark,

    I have a script which decodes a URL; I just need to feed the input URL to it. Can you help me please?
    Below is the script:
    #!python

    import sys
    import re
    import urllib.parse
    import html.parser

    def main():
        rewrittenurl = sys.argv[1]
        match = re.search(r'https://urldefense.proofpoint.com/(v[0-9])/', rewrittenurl)
        if match:
            if match.group(1) == 'v1':
                decodev1(rewrittenurl)
            elif match.group(1) == 'v2':
                decodev2(rewrittenurl)
            else:
                print('Unrecognized version in: ', rewrittenurl)
        else:
            print('No valid URL found in input: ', rewrittenurl)

    def decodev1(rewrittenurl):
        match = re.search(r'u=(.+?)&k=', rewrittenurl)
        if match:
            urlencodedurl = match.group(1)
            htmlencodedurl = urllib.parse.unquote(urlencodedurl)
            url = html.parser.HTMLParser().unescape(htmlencodedurl)
            print(url)
        else:
            print('Error parsing URL')

    def decodev2(rewrittenurl):
        match = re.search(r'u=(.+?)&[dc]=', rewrittenurl)
        if match:
            specialencodedurl = match.group(1)
            trans = str.maketrans('-_', '%/')
            urlencodedurl = specialencodedurl.translate(trans)
            htmlencodedurl = urllib.parse.unquote(urlencodedurl)
            url = html.parser.HTMLParser().unescape(htmlencodedurl)
            print(url)
        else:
            print('Error parsing URL')

    if __name__ == '__main__':
        main()
    I need to give the rewrittenurl variable the output of my previous step in Pentaho (which is coming from a Select values step with field name S3link).
    Thanks

  8. Hello

    I am using this step in Kettle and would like to test a Python script in this step, following the pandas pivot table walkthrough. But I'm not getting the same results.

    I used the example, I got some values, but in some cases I have a number format problem.

    I would like to know if there is something I have to change for the pivot table to work, as in the example at
    http://pbpython.com/pandas-pivot-table-explained.html

  9. can anyone give the exact steps to configure cpython in kettle

    Replies
    1. It should be fairly straightforward. I recommend installing the Anaconda python distribution, because this comes with the few dependencies that are required for the step to operate. Beyond this, all that is necessary is that the python executable is in your PATH environment variable. Note that if you are using OS X you will need to start Spoon via a console using the spoon.sh script so that the PATH variable is picked up (I have not yet found a reliable way of specifying environment variables in the Info.plist file in a Mac application bundle).

      Another thing to watch out for is permissions on the python side. Some people have reported problems when their Anaconda distribution is installed system-wide. Python occasionally needs to write files and there can be problems unless you are running with elevated privileges. The best thing to do is to install Anaconda locally in your home directory.

      Cheers,
      Mark.

  10. One annoying thing with the plugin is when the python side tries to ship a unicode string back and a Message Size io exception occurs. At this point (at least in Windows and Spoon) subsequent attempts to run the CPython step fail with a java.io socket exception and I have to restart Spoon to continue to work. This isn't a bug since when the dataframe is properly constructed with ASCII only it works fine. Not sure if that socket can be disposed of somehow. I haven't checked but I get the feeling there is a python process left spinning in limbo.

    Replies
    1. There's a bug in the call to taskkill on Windows -- I've submitted a fix for it. So the restart issue may be addressed by that fix for you. Not sure about the Message Size IOException however.

  11. Yes, the micro server and the PDI side of things require UTF-8. I'll have to return to the code at some stage and see if it is possible to recover from a catastrophic failure, kill any orphaned python process and establish a new socket.

    Cheers,
    Mark.

  12. Hi Mark,

    thanks a lot for the great plugin. I am having trouble executing a ktr with a CPython step in it on a remote (headless) server.

    Executing the ktr locally in PDI (Spoon) works fine, but once I use a remote server as the run configuration it complains about a missing plugin. The server is a headless Linux box without the client tools. Is it possible to run a CPython ktr in such an environment, and where do I have to install the plugin?

    Thanks
    Christoph

  13. Hi Christoph,

    What sort of server is this? Carte or BA server? I don't have a 7.x install at hand at the moment, only an old 6.1 version. Within that version of the BA server plugins for PDI can be installed in

    suite/server/data-integration-server/pentaho-solutions/system/kettle/plugins/

    Cheers,
    Mark.

    Replies
    1. Hi Mark,

      thanks a lot for your reply. After copying the plugin into the right folder the server is not complaining about a missing plugin anymore. But I get a java null pointer exception without a lot more information:

      2017-10-20 07:11:06,417 INFO [org.pentaho.di] 2017/10/20 07:11:06 - 170817_Python_Test - Dispatching started for transformation [170817_Python_Test]
      2017-10-20 07:11:07,042 ERROR [org.pentaho.di] 2017/10/20 07:11:07 - CPython Script Executor.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Unexpected error
      2017-10-20 07:11:07,043 ERROR [org.pentaho.di] 2017/10/20 07:11:07 - CPython Script Executor.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : java.lang.NullPointerException
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.python.PythonSession.executeScript(PythonSession.java:479)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.executeScript(CPythonScriptExecutor.java:446)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.executeScriptAndProcessResult(CPythonScriptExecutor.java:349)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.processBatch(CPythonScriptExecutor.java:338)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.processRow(CPythonScriptExecutor.java:243)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
      2017/10/20 07:11:07 - CPython Script Executor.0 - at java.lang.Thread.run(Thread.java:748)

      python 3.5 is installed and added to the path system variable. The necessary packages are installed as well.
      Any hint what might be wrong?

      Cheers
      Christoph

  14. Are you sure that the PATH environment variable is available to PDI running on the server (and contains the entry for python)? Perhaps create a simple transformation that prints the contents of PATH to a text file so that you can check.
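    A CPython step script for such a check could be as small as this (variable names illustrative):

```python
import os

import pandas as pd

# Expose the PATH that this python process actually sees as a one-row frame
path_df = pd.DataFrame({"path": [os.environ.get("PATH", "")]})
```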

    Cheers,
    Mark.

  15. Hi Mark,

    I have checked it again and all prerequisites seem to be fulfilled. I have executed a simple job on the remote server with a shell step and the following script:

    echo $PATH
    which python
    python --version

    And here is the corresponding pentaho log on the server:

    2017-10-24 12:44:27,099 INFO [org.pentaho.di] 2017/10/24 12:44:27 - RepositoriesMeta - Reading repositories XML file: /pentaho/.kettle/repositories.xml
    2017-10-24 12:44:27,100 INFO [org.pentaho.di] 2017/10/24 12:44:27 - PurRepository - Creating repository meta store interface
    2017-10-24 12:44:27,104 INFO [org.pentaho.di] 2017/10/24 12:44:27 - PurRepository - Connected to the enterprise repository
    2017-10-24 12:44:27,104 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Connected to AWS Pentaho Repo as admin
    2017-10-24 12:44:27,172 INFO [org.pentaho.di] 2017/10/24 12:44:27 - PurRepository - Creating repository meta store interface
    2017-10-24 12:44:27,173 INFO [org.pentaho.di] 2017/10/24 12:44:27 - PurRepository - Connected to the enterprise repository
    2017-10-24 12:44:27,218 INFO [org.pentaho.di] 2017/10/24 12:44:27 - my_shell_test - Start of job execution
    2017-10-24 12:44:27,238 INFO [org.pentaho.di] 2017/10/24 12:44:27 - my_shell_test - Starting entry [Shell]
    2017-10-24 12:44:27,245 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Shell - Running on platform : Linux
    2017-10-24 12:44:27,255 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Shell - Executing command : /pentaho/server/pentaho-server/tomcat/temp/kettle_123c6e0a-b8b9-11e7-af93-19d8b78f0f78shell
    2017-10-24 12:44:27,258 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Shell - (stdout) /usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/snap/bin:/usr/lib/jvm/java-8-oracle/bin:/usr/lib/jvm/java-8-oracle/db/bin:/usr/lib/jvm/java-8-oracle/jre/bin
    2017-10-24 12:44:27,258 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Shell - (stdout) /usr/bin/python
    2017-10-24 12:44:27,259 INFO [org.pentaho.di] 2017/10/24 12:44:27 - Shell - (stdout) Python 3.5.2
    2017-10-24 12:44:27,269 INFO [org.pentaho.di] 2017/10/24 12:44:27 - my_shell_test - Finished job entry [Shell] (result=[true])
    2017-10-24 12:44:27,270 INFO [org.pentaho.di] 2017/10/24 12:44:27 - my_shell_test - Job execution finished

    The PATH environment variable can be resolved and contains /usr/bin where python is located and the python --version shows the version 3.5.2 I have installed.

    But the simple python script:
    # python script
    import pandas as pd

    X=15

    Generates a null pointer exception. Any ideas on how to debug that problem?

    Cheers
    Christoph

  16. Hmm. The next question would be: is python on the server set up the same as on your desktop machine? I.e. does it have the python packages installed that the PDI plugin requires - pandas, numpy, matplotlib etc.? If you ssh to the server and run the transformation from the data integration installation using the pan command line tool, does it work?

    Cheers,
    Mark.

    Replies
    1. Hi Mark,

      thank you very much for your support! I had not been aware of the matplotlib requirement. I had installed python using anaconda locally on my laptop, but not on the server.

      You probably could update the blog post requirements section to include numpy and matplotlib as mandatory.

      Thanks again for your great plugin and support. It's really appreciated.

      Cheers
      Christoph

    2. Hi Chris,

      Good point :-) I've updated the requirements section. I'm glad to hear that it worked on the server in the end.

      Cheers,
      Mark.

  17. Thank you Mark, you built the bridge that was missing on PDI

  18. Hello Mark,


    What I did so far:
    - The transformation is running under root privileges (PDI5.3 Ubuntu16.04)
    - Python 2 and 3 are both on PATH
    - All dependencies were installed for both
    - The pyCheck.py and pyServer.py were extracted from .jar file and manually tested on python 2 and 3
    - command: "# python /var/pyCheck.py" (OK ExitStatus:0 Zero dependency errors)
    - command: "# python /var/pyServer.py" (OK ExitStatus:1 Zero dependency errors)
    - The same 2 commands above were executed under a .kjb, inside 2 "ExecuteAShellScript" steps (no dependency errors either)
    - The most recent github's CPythonScriptExecutor was downloaded and compiled.
    - Even the simplest python code (eg: whatever=1+1) generates error

    The stacktrace is always the same:
    ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Unexpected error
    ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : java.lang.NullPointerException
    at org.pentaho.python.PythonSession.rowsToPythonDataFrame(PythonSession.java:409)
    at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.rowsToPyDataFrame(CPythonScriptExecutor.java:458)
    at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.processBatch(CPythonScriptExecutor.java:276)
    at org.pentaho.di.trans.steps.cpythonscriptexecutor.CPythonScriptExecutor.processRow(CPythonScriptExecutor.java:243)
    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
    at java.lang.Thread.run(Thread.java:748)
    Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)

    I've spent all day trying to get this step running and I have no idea what else can be done. I really appreciate any tip that would help track down why this NPE is occurring.


    Thank you

  19. Hi margenn,

    Was this running the example transformation from the github project? And I assume that you set the PATH from a terminal, and started PDI from spoon.sh? NPEs are almost always associated with an environment that is not configured correctly.

    Cheers,
    Mark.

  20. Hi Mark,

    Thanks a lot for this plugin. On a Windows machine everything works fine (Pentaho 8.0), but on my AWS Linux machine for some reason it won't work properly (also Pentaho 8.0).
    The OS is Ubuntu 16.04 with python 3.6.3 (including matplotlib 2.1.1, numpy 1.13.3, pandas 0.21.1 and sklearn 0.19.1). Pentaho is running with root permissions.

    I get the following error:
    2018/01/29 09:30:47 - CPython Script Executor - Was unable to start python server
    2018/01/29 09:30:47 - CPython Script Executor - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : org.pentaho.di.core.exception.KettleException:
    2018/01/29 09:30:47 - CPython Script Executor - java.io.IOException: Was unable to start python server
    2018/01/29 09:30:47 - CPython Script Executor - Was unable to start python server

    I've spent some time trying to get the CPython Script Executor step running but at this moment I have no idea what might be wrong. It seems like some kind of Java-side error. Have I missed anything crucial in the setup?

    Cheers,
    Matthias

  21. Hi Matthias,

    There is also a dependency on scipy. Since sklearn depends on it, I always assumed that pip would install it too when sklearn is installed - not so, as I found out just recently :-)

    Anyhow, try installing scipy. Other dependencies (which you should have already) include: io (python 3), StringIO (python 2), math, traceback, socket, struct, os, json, base64, and pickle.

    I always use Anaconda python, which has everything and the kitchen sink out of the box.

    Cheers,
    Mark.

  22. Hi Mark,
    Thank you for your answer. I checked all of those dependencies and they're all installed and up to date. I also tried to use Anaconda python (as recommended) but unfortunately I still get the following error:

    2018/01/31 15:04:31 - CPython Script Executor.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) :
    2018/01/31 15:04:31 - CPython Script Executor.0 - java.io.IOException: Was unable to start python server
    2018/01/31 15:04:31 - CPython Script Executor.0 - Was unable to start python server
    2018/01/31 15:04:31 - CPython Script Executor.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : org.pentaho.di.core.exception.KettleException:
    2018/01/31 15:04:31 - CPython Script Executor.0 - java.io.IOException: Was unable to start python server
    2018/01/31 15:04:31 - CPython Script Executor.0 - Was unable to start python server


    Cheers,
    Matthias

    Replies
    1. Try "yum install PyQt4 tkinter".

    2. I reinstalled pyqt4 and tkinter but unfortunately it didn't help. The java.io.IOException: Was unable to start python server remains.

    3. One of our support guys had a similar issue under Ubuntu. He was able to get the step working after executing:

      sudo apt-get install python3-tk

      Cheers,
      Mark.

    4. Hi, did you manage to fix this, I'm struggling with the same issue

  23. Hi Mark, thank you for the plugin.
    I'm using PDI 8.0 CE on Windows X. When I try to connect the CPython step with another I get the following error:
    java.lang.ClassCastException: java.base/java.lang.String cannot be cast to org.pentaho.di.trans.step.StepMeta
    at org.pentaho.di.trans.step.StepMeta.equals(StepMeta.java:537)
    at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:801)
    at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
    at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
    at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
    at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1366)
    at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7984)
    at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9245)
    at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:692)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.base/java.lang.reflect.Method.invoke(Unknown Source)
    at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)

    what went wrong?
    Cheers,
    Guido

  24. Very strange! I've never seen this error before, and can't reproduce it using my copy of PDI 8. Do you have the latest version of the CPython step (v1.4) installed from the marketplace?

    Cheers,
    Mark.

    Replies
    1. I have the latest version; I downloaded it from the marketplace because my Spoon version does not show the marketplace button. When I try to connect the CPython step to the previous step the program says that no input is received from the CPython step.

  26. Hi Mark,

    I have the same problem, but the OS is Red Hat Linux 7, PDI 8 and Pentaho Server 8, with python 2.7 (all libraries in the requirements), and communication fails:

    2018-04-11 18:42:42,961 ERROR [org.pentaho.di] 2018/04/11 18:42:42 - CPython Script Executor.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : org.pentaho.di.core.exception.KettleException:
    2018/04/11 18:42:42 - CPython Script Executor.0 - java.io.IOException: Was unable to start python server
    2018/04/11 18:42:42 - CPython Script Executor.0 - Was unable to start python server


    On Windows it executes successfully!

    Any help with this problem?

    Thanks,
    Yerko.

  28. Mark, I'm getting a `java.lang.ArrayIndexOutOfBoundsException` when the dataframe I pass back is empty (edge case); is it not ok to pass empty dataframes into the stream?

    Thanks - brez

  32. Dear Mark,

    I discovered that, when using CPython scripting in Pentaho, the values in the data tables (frames) entering the Python code at a CPython step differ totally from the output of the previous step. For example, a 200,000 row table flows into the CPython step, but only 200 rows arrive in the Python code. (Because I placed Python code for writing output to my PC within the CPython scripts, I can see the values of the variables in CPython.)
    Besides, the values in each column within the 100 row table differ from the output of the previous step. For instance, various string values in the original data table become "value1", "value2", etc.
    The reason I care about how CPython communicates with Pentaho is that CPython generates totally different results from those generated by pure Python code.
    Besides, I am not able to get output from the CPython step to flow into the following steps; I can only generate output via Python code within the CPython scripts. From the Pentaho step metrics interface, I see only inflow into CPython but no outflow, and the CPython step hangs with no finishing status.

    ReplyDelete
  37. Hi Mark,
    I installed the CPython Script Executor via the marketplace on pdi-ce-8.2 on Ubuntu 18 and installed all the required libraries via pip. When I launch pyTest1 (the example you describe) from Spoon, it gives me these errors:
    CPython Script Executor 2.0 - There was a problem initializing the python environment:
    CPython Script Executor 2.0 - Library "scipy" is not available
    CPython Script Executor 2.0 - Library "sklearn" is not available
    CPython Script Executor 2.0 - Library "matplotlib" is not available
    CPython Script Executor 2.0 - Library "numpy" is not available
    CPython Script Executor 2.0 - Library "pandas" is not available

    Is there a way to solve this issue?
    thank you

    ReplyDelete
  38. Hi Mark
    I've installed Pentaho and used it to create transformations successfully, but when I use the CPython Script Executor I get the following error:
    2019/05/07 11:40:59 - CPython Script Executor.0 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) :
    2019/05/07 11:40:59 - CPython Script Executor.0 - java.io.IOException: Was unable to start python server
    2019/05/07 11:40:59 - CPython Script Executor.0 - Was unable to start python server.
    Can you please guide?

    ReplyDelete
    Replies
    1. I've installed Numpy and matplotlib and all the dependencies as well. Thanks in advance

      Delete
    2. I had the same problem and resolved it by setting Windows permissions to "Everyone - Full Control" on the Python folder (C:\Python38).

      Delete
    3. Were you able to solve this? I am facing the same issue.

      Delete
  42. I am getting the error below when I try to run the step:

    CPython Script Executor.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) :
    2019/07/29 14:40:08 - CPython Script Executor.0 - There was a problem initializing the python environment:

    Library "sklearn" is not available
    2019/07/29 14:40:08 - CPython Script Executor.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : org.pentaho.di.core.exception.KettleException:
    2019/07/29 14:40:08 - CPython Script Executor.0 - There was a problem initializing the python environment:

    Library "sklearn" is not available

    ReplyDelete
  43. I am able to import all the required packages in my local setup. I am new to Pentaho, but I have been working with Python, so I am sure this is not a Python setup issue:

    >>> from scipy.fftpack import fft
    >>> import scipy.integrate
    >>> from scipy import interpolate
    >>> from sklearn import datasets
    >>> from sklearn import metrics
    >>> import scipy
    >>> import numpy
    >>> import pandas
    >>> from scipy.constants import pi

    Yet when I run the transformation I still get the same error as in my previous comment:

    CPython Script Executor.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) :
    2019/07/29 14:40:08 - CPython Script Executor.0 - There was a problem initializing the python environment:
    Library "sklearn" is not available

    ReplyDelete
  48. Thanks Mark, it's a great tutorial. Is it possible to do web scraping in Pentaho?

    ReplyDelete
  54. Hi,
    This plugin is fantastic! Congrats!

    I'm trying to use this component in a specific situation, but every time PDI returns this error:
    java.net.SocketException: Broken pipe (Write failed)
    2019/10/14 10:40:15 - CPython Script Executor 2.0 - Broken pipe (Write failed)

    ReplyDelete
  82. Hi Mark, I managed to set up the PDI environment to run Python, and I even managed to run a test script, but how do I use the data from the previous step inside the Python script?

    ReplyDelete
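    A note on the question above: per the post, the step ships incoming rows to Python as CSV and materialises them with pandas, so inside the script the previous step's rows appear as a pandas DataFrame. A minimal sketch follows; note that `df` and `output` are assumed names for illustration only, since the actual variable names are whatever is configured in the step dialog.

```python
import pandas as pd

# Inside the step, the incoming frame would already exist under the name
# configured in the step dialog; it is fabricated here so the sketch runs
# on its own. "df" is an assumed name, not a fixed default.
df = pd.DataFrame({"customer": ["a", "b", "a"],
                   "sales": [10.0, 20.0, 5.0]})

# Transform the incoming rows with ordinary pandas operations...
summary = df.groupby("customer", as_index=False)["sales"].sum()

# ...and bind the result to the variable the step is configured to read
# back, so the rows continue down the transformation ("output" is likewise
# an assumed name).
output = summary
```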

  110. Greetings Mark, thanks for such a great step for working with Python. A query: how can I add two columns (variables) of a data frame, row by row, and store the result in a third column within the same data frame?

    ReplyDelete
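    On the question above: since the step hands rows to the script as a pandas DataFrame, the addition can be written as a single vectorised expression rather than an explicit row-by-row loop. A sketch, assuming the frame is named `df` with columns `a` and `b` (use whatever names your frame and columns actually have):

```python
import pandas as pd

# Stand-in for the frame the step would provide.
df = pd.DataFrame({"a": [1, 2, 3], "b": [10, 20, 30]})

# Element-wise addition of the two columns; the result is stored as a
# third column of the same frame.
df["c"] = df["a"] + df["b"]

print(df["c"].tolist())  # [11, 22, 33]
```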
  126. Hello,

    Thanks for the work!

    I have a question: how do you replace arguments with data coming from an input table?

    ReplyDelete
  131. Hello Mark, your step is a life saver. However, I was wondering whether error handling can be enabled at the step level? The option is greyed out, and I have a tricky situation with file-read race conditions.
    Cheers, E

    ReplyDelete