Next Generation of pyClowder

Build: #23 failed

Job: Default Job failed

Build log

The build generated 733 lines of output.

18-Jul-2017 13:18:21 Build Clowder - pyclowder2 - geostreams - Default Job #23 (CATS-PYC26-JOB1-23) started building on agent buildserver-2.os.ncsa.edu
18-Jul-2017 13:18:21 Remote agent on host buildserver-2.os.ncsa.edu
18-Jul-2017 13:18:21 Build working directory is /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
18-Jul-2017 13:18:21 Executing build Clowder - pyclowder2 - geostreams - Default Job #23 (CATS-PYC26-JOB1-23)
18-Jul-2017 13:18:21 Starting task 'Checkout Default Repository' of type 'com.atlassian.bamboo.plugins.vcs:task.vcs.checkout'
18-Jul-2017 13:18:21 Updating source code to revision: 3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
18-Jul-2017 13:18:21 Fetching 'refs/heads/geostreams' from 'ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git'. Will try to do a shallow fetch.
18-Jul-2017 13:18:21 Warning: Permanently added '[127.0.0.1]:39815' (RSA) to the list of known hosts.
18-Jul-2017 13:18:21 From ssh://127.0.0.1:39815/cats/pyclowder2
18-Jul-2017 13:18:21 + 47c8bbf...3705a7c geostreams -> geostreams  (forced update)
18-Jul-2017 13:18:21 Checking out revision 3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21.
18-Jul-2017 13:18:21 Already on 'geostreams'
18-Jul-2017 13:18:21 Updated source code to revision: 3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
18-Jul-2017 13:18:21 Finished task 'Checkout Default Repository' with result: Success
18-Jul-2017 13:18:21 Running pre-build action: VCS Version Collector
18-Jul-2017 13:18:21 Starting task 'pytest' of type 'com.atlassian.bamboo.plugins.scripttask:task.builder.script'
18-Jul-2017 13:18:21
Beginning to execute external process for build 'Clowder - pyclowder2 - geostreams - Default Job #23 (CATS-PYC26-JOB1-23)'
... running command line:
/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-23-ScriptBuildTask-5964688587154337776.sh
... in: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
... using extra environment variables:
bamboo_planRepository_1_branch=geostreams
bamboo_capability_system_builder_node_Node_js_v6_9_1=/home/bamboo/node-v6.9.1/bin/node
bamboo_capability_system_builder_command_npm_6=/home/bamboo/node-v6.9.1/bin/npm
bamboo_capability_system_builder_command_buckminster_4_3=/home/bamboo/buckminster-4.3/buckminster
bamboo_capability_system_builder_command_buckminster_4_2=/home/bamboo/buckminster-4.2/buckminster
bamboo_planRepository_1_branchDisplayName=geostreams
bamboo_repository_revision_number=3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
bamboo_resultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-23
bamboo_repository_127172662_previous_revision_number=47c8bbfc6ca5797d7a47aa69f2c411a6ce793aea
bamboo_capability_system_builder_command_sphinx=/usr/bin/sphinx-build
bamboo_planRepository_1_name=pyclowder2
bamboo_repository_127172662_branch_name=geostreams
bamboo_build_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_buildKey=CATS-PYC26-JOB1
bamboo_capability_system_os=linux
bamboo_repository_127172662_git_branch=geostreams
bamboo_shortPlanName=geostreams
bamboo_repository_127172662_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_repository_127172662_revision_number=3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
bamboo_planRepository_name=pyclowder2
bamboo_buildNumber=23
bamboo_repository_127172662_name=pyclowder2
bamboo_shortJobName=Default Job
bamboo_buildResultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-23
bamboo_planRepository_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_agentId=140673028
bamboo_planName=Clowder - pyclowder2 - geostreams
bamboo_shortPlanKey=PYC26
bamboo_capability_system_builder_command_sbt_0_13=/home/bamboo/sbt-0.13.2/bin/sbt
bamboo_shortJobKey=JOB1
bamboo_capability_system_builder_node_Node_js_v0_10_28=/home/bamboo/node-v0.10.28/bin/node
bamboo_planRepository_revision=3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
bamboo_repository_previous_revision_number=47c8bbfc6ca5797d7a47aa69f2c411a6ce793aea
bamboo_buildTimeStamp=2017-07-18T13:18:21.009-05:00
bamboo_capability_system_builder_command_npm=/home/bamboo/node-v0.10.28/bin/npm
bamboo_planRepository_previousRevision=47c8bbfc6ca5797d7a47aa69f2c411a6ce793aea
bamboo_capability_system_builder_mvn2_Maven_2=/home/bamboo/apache-maven-2.2.1
bamboo_buildResultKey=CATS-PYC26-JOB1-23
bamboo_repository_git_branch=geostreams
bamboo_repository_branch_name=geostreams
bamboo_buildPlanName=Clowder - pyclowder2 - geostreams - Default Job
bamboo_planRepository_1_revision=3705a7cdcb2ad236ea49479e1ee7c20ea32c1f21
bamboo_capability_system_builder_command_python3=/usr/bin/python3
bamboo_repository_name=pyclowder2
bamboo_repository_127172662_git_username=
bamboo_buildFailed=false
bamboo_capability_system_docker_executable=/usr/bin/docker
bamboo_capability_system_builder_command_grunt=/home/bamboo/node-v0.10.28/bin/grunt
bamboo_planRepository_branch=geostreams
bamboo_agentWorkingDirectory=/home/bamboo/bamboo-agent-home/xml-data/build-dir
bamboo_capability_system_git_executable=/usr/bin/git
bamboo_planRepository_1_previousRevision=47c8bbfc6ca5797d7a47aa69f2c411a6ce793aea
bamboo_repository_git_username=
bamboo_capability_system_builder_sbt_SBT_0_13_13=/home/bamboo/sbt-0.13.13
bamboo_planRepository_branchDisplayName=geostreams
bamboo_capability_system_builder_command_phantomjs=/home/bamboo/phantomjs-1.9.8/bin/phantomjs
bamboo_planRepository_1_type=bbserver
bamboo_planRepository_branchName=geostreams
bamboo_capability_system_builder_command_python2_7=/usr/bin/python2.7
bamboo_capability_system_hostname=buildserver-1
bamboo_capability_system_jdk_JDK=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_software_mongo=/usr/bin/mongo
bamboo_plan_storageTag=plan-126976103
bamboo_capability_system_software_rabbitmq=/usr/sbin/rabbitmqctl
bamboo_capability_system_builder_command_casperjs=/home/bamboo/node-v0.10.28/bin/casperjs
bamboo_planRepository_type=bbserver
bamboo_planRepository_1_username=
bamboo_capability_system_jdk_JDK_1_8_0_66=/home/bamboo/jdk1.8.0_66
bamboo_repository_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_capability_system_builder_node_Node_js=/home/bamboo/node-v0.10.28/bin/node
bamboo_capability_system_builder_ant_Ant=/home/bamboo/apache-ant-1.9.4
bamboo_capability_system_builder_mvn3_Maven_3=/home/bamboo/apache-maven-3.3.9
bamboo_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_planKey=CATS-PYC26
bamboo_planRepository_1_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_planRepository_username=
bamboo_capability_system_jdk_JDK_1_8=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_jdk_JDK_1_6=/home/bamboo/jdk1.6.0_41
bamboo_capability_system_builder_command_mkcrx=/home/bamboo/mkcrx/mkcrx.sh
bamboo_capability_system_jdk_JDK_1_7=/home/bamboo/jdk1.7.0_60
bamboo_planRepository_1_branchName=geostreams
18-Jul-2017 13:18:21 New python executable in /tmp/virtualenv/pyclowder2/bin/python2
18-Jul-2017 13:18:21 Also creating executable in /tmp/virtualenv/pyclowder2/bin/python
18-Jul-2017 13:18:24 Installing setuptools, pkg_resources, pip, wheel...done.
18-Jul-2017 13:18:24 Running virtualenv with interpreter /usr/bin/python2
18-Jul-2017 13:18:24 Collecting enum34==1.1.6 (from -r requirements.txt (line 1))
18-Jul-2017 13:18:24   Using cached enum34-1.1.6-py2-none-any.whl
18-Jul-2017 13:18:24 Collecting et-xmlfile==1.0.1 (from -r requirements.txt (line 2))
18-Jul-2017 13:18:24 Collecting jdcal==1.3 (from -r requirements.txt (line 3))
18-Jul-2017 13:18:24 Collecting openpyxl==2.4.1 (from -r requirements.txt (line 4))
18-Jul-2017 13:18:24 Collecting pika==0.10.0 (from -r requirements.txt (line 5))
18-Jul-2017 13:18:24   Using cached pika-0.10.0-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting python-dateutil==2.6.0 (from -r requirements.txt (line 6))
18-Jul-2017 13:18:25   Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting pytz==2016.10 (from -r requirements.txt (line 7))
18-Jul-2017 13:18:25   Using cached pytz-2016.10-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting PyYAML==3.11 (from -r requirements.txt (line 8))
18-Jul-2017 13:18:25 Collecting requests==2.10.0 (from -r requirements.txt (line 9))
18-Jul-2017 13:18:25   Using cached requests-2.10.0-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting six==1.10.0 (from -r requirements.txt (line 10))
18-Jul-2017 13:18:25   Using cached six-1.10.0-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting wheel==0.24.0 (from -r requirements.txt (line 11))
18-Jul-2017 13:18:25   Using cached wheel-0.24.0-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting pytest==3.0.3 (from -r requirements.txt (line 12))
18-Jul-2017 13:18:25   Using cached pytest-3.0.3-py2.py3-none-any.whl
18-Jul-2017 13:18:25 Collecting pytest-pep8==1.0.6 (from -r requirements.txt (line 13))
18-Jul-2017 13:18:25 Collecting pytest-capturelog==0.7 (from -r requirements.txt (line 14))
18-Jul-2017 13:18:25   Downloading pytest-capturelog-0.7.tar.gz
18-Jul-2017 13:18:26 Collecting urllib3==1.14 (from -r requirements.txt (line 15))
18-Jul-2017 13:18:26   Using cached urllib3-1.14-py2.py3-none-any.whl
18-Jul-2017 13:18:26 Collecting py>=1.4.29 (from pytest==3.0.3->-r requirements.txt (line 12))
18-Jul-2017 13:18:26   Using cached py-1.4.34-py2.py3-none-any.whl
18-Jul-2017 13:18:26 Collecting pep8>=1.3 (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
18-Jul-2017 13:18:26   Using cached pep8-1.7.0-py2.py3-none-any.whl
18-Jul-2017 13:18:26 Collecting pytest-cache (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
18-Jul-2017 13:18:26 Collecting execnet>=1.1.dev1 (from pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
18-Jul-2017 13:18:26   Using cached execnet-1.4.1-py2.py3-none-any.whl
18-Jul-2017 13:18:26 Collecting apipkg>=1.4 (from execnet>=1.1.dev1->pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
18-Jul-2017 13:18:26   Using cached apipkg-1.4-py2.py3-none-any.whl
18-Jul-2017 13:18:26 Building wheels for collected packages: pytest-capturelog
18-Jul-2017 13:18:26   Running setup.py bdist_wheel for pytest-capturelog: started
18-Jul-2017 13:18:27   Running setup.py bdist_wheel for pytest-capturelog: finished with status 'done'
18-Jul-2017 13:18:27   Stored in directory: /home/bamboo/.cache/pip/wheels/38/6b/95/818d4ca4ee7f217b11b5fdad918f47dd2c2bc768ce32dc6001
18-Jul-2017 13:18:27 Successfully built pytest-capturelog
18-Jul-2017 13:18:27 Installing collected packages: enum34, et-xmlfile, jdcal, openpyxl, pika, six, python-dateutil, pytz, PyYAML, requests, wheel, py, pytest, pep8, apipkg, execnet, pytest-cache, pytest-pep8, pytest-capturelog, urllib3
18-Jul-2017 13:18:28   Found existing installation: wheel 0.30.0a0
18-Jul-2017 13:18:28     Uninstalling wheel-0.30.0a0:
18-Jul-2017 13:18:28       Successfully uninstalled wheel-0.30.0a0
18-Jul-2017 13:18:28 Successfully installed PyYAML-3.11 apipkg-1.4 enum34-1.1.6 et-xmlfile-1.0.1 execnet-1.4.1 jdcal-1.3 openpyxl-2.4.1 pep8-1.7.0 pika-0.10.0 py-1.4.34 pytest-3.0.3 pytest-cache-1.0 pytest-capturelog-0.7 pytest-pep8-1.0.6 python-dateutil-2.6.0 pytz-2016.10 requests-2.10.0 six-1.10.0 urllib3-1.14 wheel-0.24.0
18-Jul-2017 13:18:28 ============================= test session starts ==============================
18-Jul-2017 13:18:28 platform linux2 -- Python 2.7.12, pytest-3.0.3, py-1.4.34, pluggy-0.4.0
18-Jul-2017 13:18:28 rootdir: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1, inifile: setup.cfg
18-Jul-2017 13:18:28 plugins: pep8-1.0.6, capturelog-0.7
18-Jul-2017 13:18:28 collected 40 items
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 setup.py .
18-Jul-2017 13:18:29 docs/source/conf.py F
18-Jul-2017 13:18:29 pyclowder/__init__.py .
18-Jul-2017 13:18:29 pyclowder/client.py F
18-Jul-2017 13:18:29 pyclowder/collections.py .
18-Jul-2017 13:18:29 pyclowder/connectors.py .
18-Jul-2017 13:18:29 pyclowder/datasets.py .
18-Jul-2017 13:18:29 pyclowder/extractors.py .
18-Jul-2017 13:18:29 pyclowder/files.py .
18-Jul-2017 13:18:29 pyclowder/sections.py .
18-Jul-2017 13:18:29 pyclowder/utils.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/__init__.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/cache.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/csv.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/datapoints.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/datasets.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/geocode-convert.py F
18-Jul-2017 13:18:29 pyclowder/geostreams/geocode_convert.py F
18-Jul-2017 13:18:29 pyclowder/geostreams/map_names.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/sensors.py F
18-Jul-2017 13:18:29 pyclowder/geostreams/streams.py .
18-Jul-2017 13:18:29 pyclowder/geostreams/time_transformers.py .
18-Jul-2017 13:18:29 sample-extractors/echo/echo.py .
18-Jul-2017 13:18:29 sample-extractors/wordcount/wordcount.py .
18-Jul-2017 13:18:29 tests/__init__.py .
18-Jul-2017 13:18:29 tests/conftest.py F
18-Jul-2017 13:18:29 tests/test_datapoints.py FF
18-Jul-2017 13:18:29 tests/test_geostreams.py .FFF
18-Jul-2017 13:18:29 tests/test_sensors.py .FFF
18-Jul-2017 13:18:29 tests/test_streams.py FFFF
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 generated xml file: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/test-reports/results.xml
18-Jul-2017 13:18:29 =================================== FAILURES ===================================
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/docs/source/conf.py:34:5: E128 continuation line under-indented for visual indent
18-Jul-2017 13:18:29     'sphinx.ext.doctest',
18-Jul-2017 13:18:29     ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/docs/source/conf.py:35:5: E128 continuation line under-indented for visual indent
18-Jul-2017 13:18:29     'sphinx.ext.viewcode',
18-Jul-2017 13:18:29     ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/docs/source/conf.py:36:5: E128 continuation line under-indented for visual indent
18-Jul-2017 13:18:29     'sphinx.ext.githubpages']
18-Jul-2017 13:18:29     ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/docs/source/conf.py:159:1: W391 blank line at end of file
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 ^
18-Jul-2017 13:18:29
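The four conf.py violations above are purely formatting: E128 flags continuation lines that do not line up with the opening bracket, and W391 a blank line at the end of the file. A hanging indent clears all of them. The sketch below uses only the three extension names visible in the log; any entries before line 34 of conf.py are not shown there and are left elided.

```python
# docs/source/conf.py (sketch): a hanging indent satisfies E128, and ending
# the file without a trailing blank line satisfies W391.
extensions = [
    # ... entries before line 34 are not shown in the log ...
    'sphinx.ext.doctest',
    'sphinx.ext.viewcode',
    'sphinx.ext.githubpages',
]
```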
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/client.py:145:121: E501 line too long (137 > 120 characters)
18-Jul-2017 13:18:29             return requests.post(url, params=params, data=json.dumps(content), headers=self.headers, auth=(self.username, self.password))
18-Jul-2017 13:18:29                                                                                                                         ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/client.py:163:121: E501 line too long (127 > 120 characters)
18-Jul-2017 13:18:29             return requests.post(url, params=params, files={"File": open(filename, 'rb')}, auth=(self.username, self.password))
18-Jul-2017 13:18:29                                                                                                                         ^
18-Jul-2017 13:18:29
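Both client.py violations are E501 (lines over the 120-character limit). Splitting the call into one argument per line fixes them without changing behavior. The sketch below wraps the first call; it takes a `post` callable as a stand-in for `requests.post` so it can run without a server, and the parameter names mirror those visible in the log.

```python
import json


def post_json(post, url, params, content, headers, username, password):
    # One argument per line keeps every line well under 120 characters (E501).
    # `post` stands in for requests.post so the sketch runs offline.
    return post(
        url,
        params=params,
        data=json.dumps(content),
        headers=headers,
        auth=(username, password),
    )
```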
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:5:1: E302 expected 2 blank lines, found 1
18-Jul-2017 13:18:29 def dms2dec(dms):
18-Jul-2017 13:18:29 ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:33: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                 ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:35: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                   ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:53: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                     ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:55: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                       ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:73: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                         ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:75: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                           ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:104: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                        ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:106: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                          ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:121: E501 line too long (157 > 120 characters)
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                                         ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:124: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                                            ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:126: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                                              ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:144: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                                                                ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:8:146: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2]))
18-Jul-2017 13:18:29                                                                                                                                                  ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode-convert.py:10:15: W292 no newline at end of file
18-Jul-2017 13:18:29     return dec
18-Jul-2017 13:18:29               ^
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:6:1: E302 expected 2 blank lines, found 0
18-Jul-2017 13:18:29 def dms2dec(dms):
18-Jul-2017 13:18:29 ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:38: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                      ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:40: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                        ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:58: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                          ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:60: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                            ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:78: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                              ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:80: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:109: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                             ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:111: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                               ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:121: E501 line too long (175 > 120 characters)
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                                         ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:129: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                                                 ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:131: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                                                   ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:149: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                                                                     ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:7:151: E251 unexpected spaces around keyword / parameter equals
18-Jul-2017 13:18:29     dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string())
18-Jul-2017 13:18:29                                                                                                                                                       ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/geocode_convert.py:9:26: E231 missing whitespace after ','
18-Jul-2017 13:18:29     return [float(dec[0]),float(dec[1])]
18-Jul-2017 13:18:29                          ^
18-Jul-2017 13:18:29
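The geocode_convert.py block is the same pattern repeated: E251 for spaces around keyword `=`, E501 for the over-long line, and E231 for a missing space after a comma. A PEP8-clean sketch of the same conversion is below; since the LatLon package the original uses is not available here, this stand-in computes degrees/minutes/seconds to decimal degrees arithmetically, which is an assumed equivalent rather than the project's actual implementation.

```python
def dms2dec(dms):
    """Convert ((deg, min, sec), (deg, min, sec)) lat/lon to decimal degrees.

    Dependency-free stand-in for the LatLon-based version shown in the log;
    keyword spacing (E251), line length (E501), and comma spacing (E231)
    are all PEP8-clean here.
    """
    def to_decimal(degree, minute, second):
        sign = -1.0 if degree < 0 else 1.0
        return sign * (abs(degree) + minute / 60.0 + second / 3600.0)

    return [to_decimal(*dms[0]), to_decimal(*dms[1])]
```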
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/pyclowder/geostreams/sensors.py:69:12: E111 indentation is not a multiple of four
18-Jul-2017 13:18:29            return self.client.post("/geostreams/sensors", sensor)
18-Jul-2017 13:18:29            ^
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/tests/conftest.py:14:1: E302 expected 2 blank lines, found 1
18-Jul-2017 13:18:29 @pytest.fixture(scope="module")
18-Jul-2017 13:18:29 ^
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/tests/test_datapoints.py:11:39: W292 no newline at end of file
18-Jul-2017 13:18:29     assert response.status_code != 200
18-Jul-2017 13:18:29                                       ^
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 _____________________ test_datapoints_count_by_sensor_get ______________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c069d0>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_datapoints_count_by_sensor_get(caplog, host, key):
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = DatapointsApi(host=host, key=key)
18-Jul-2017 13:18:29 >       response = client.datapoints_count_by_sensor_get(950)
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_datapoints.py:8:
18-Jul-2017 13:18:29 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 self = <pyclowder.geostreams.datapoints.DatapointsApi object at 0x7fbbe8c06b90>
18-Jul-2017 13:18:29 sensor_id = 950
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def datapoints_count_by_sensor_get(self, sensor_id):
18-Jul-2017 13:18:29         """
18-Jul-2017 13:18:29             Get the list of all available sensors.
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29             :return: Full list of sensors.
18-Jul-2017 13:18:29             :rtype: `requests.Response`
18-Jul-2017 13:18:29             """
18-Jul-2017 13:18:29         logging.debug("Counting datapoints by sensor")
18-Jul-2017 13:18:29         try:
18-Jul-2017 13:18:29             return self.client.get("/geostreams/datapoints?sensor_id=%s&onlyCount=true" % sensor_id)
18-Jul-2017 13:18:29         except Exception as e:
18-Jul-2017 13:18:29 >           logging.error("Error counting datapoints by sensor %s: %s" % sensor_id, e.message)
18-Jul-2017 13:18:29 E           TypeError: not enough arguments for format string
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 pyclowder/geostreams/datapoints.py:45: TypeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 datapoints.py               41 DEBUG    Counting datapoints by sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
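The `TypeError` above comes from how `%` binds in `datapoints.py:45`: the format string has two `%s` placeholders but `% sensor_id` supplies only one value, and `e.message` ends up as a stray extra argument to `logging.error`. A minimal sketch of the failure and the usual fix (passing the values to `logging` and letting it interpolate lazily); the `RuntimeError` here is a hypothetical stand-in for the real request failure:

```python
import logging

logging.basicConfig(level=logging.ERROR)
sensor_id = 950

try:
    raise RuntimeError("connection refused")  # stand-in for the failed HTTP call
except RuntimeError as e:
    # Buggy pattern from the traceback: "%" binds only to sensor_id, so the
    # second "%s" is left unfilled and Python raises a TypeError before
    # logging.error is even called.
    try:
        logging.error("Error counting datapoints by sensor %s: %s" % sensor_id, e)
    except TypeError as fmt_err:
        print(fmt_err)  # not enough arguments for format string

    # Fix: hand both values to the logging call itself.
    logging.error("Error counting datapoints by sensor %s: %s", sensor_id, e)
```

The same `%`-binding mistake explains the later `Error retrieving sensor list: not enough arguments for format string` messages from `sensors.py`.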
18-Jul-2017 13:18:29 _________________________________ test_version _________________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8ba89d0>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_version(caplog, host, key):
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = ClowderClient(host=host, key=key)
18-Jul-2017 13:18:29 >       version = client.version()
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_geostreams.py:10:
18-Jul-2017 13:18:29 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18-Jul-2017 13:18:29 pyclowder/client.py:46: in version
18-Jul-2017 13:18:29     r = requests.get(url)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
18-Jul-2017 13:18:29     return request('get', url, params=params, **kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
18-Jul-2017 13:18:29     return session.request(method=method, url=url, **kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
18-Jul-2017 13:18:29     resp = self.send(prep, **send_kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
18-Jul-2017 13:18:29     r = adapter.send(request, **kwargs)
18-Jul-2017 13:18:29 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 self = <requests.adapters.HTTPAdapter object at 0x7fbbe8c06a50>
18-Jul-2017 13:18:29 request = <PreparedRequest [GET]>, stream = False
18-Jul-2017 13:18:29 timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7fbbe8bcb590>
18-Jul-2017 13:18:29 verify = True, cert = None, proxies = OrderedDict()
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
18-Jul-2017 13:18:29         """Sends PreparedRequest object. Returns Response object.
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29             :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
18-Jul-2017 13:18:29             :param stream: (optional) Whether to stream the request content.
18-Jul-2017 13:18:29             :param timeout: (optional) How long to wait for the server to send
18-Jul-2017 13:18:29                 data before giving up, as a float, or a :ref:`(connect timeout,
18-Jul-2017 13:18:29                 read timeout) <timeouts>` tuple.
18-Jul-2017 13:18:29             :type timeout: float or tuple
18-Jul-2017 13:18:29             :param verify: (optional) Whether to verify SSL certificates.
18-Jul-2017 13:18:29             :param cert: (optional) Any user-provided SSL certificate to be trusted.
18-Jul-2017 13:18:29             :param proxies: (optional) The proxies dictionary to apply to the request.
18-Jul-2017 13:18:29             """
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         conn = self.get_connection(request.url, proxies)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         self.cert_verify(conn, request.url, verify, cert)
18-Jul-2017 13:18:29         url = self.request_url(request, proxies)
18-Jul-2017 13:18:29         self.add_headers(request)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         chunked = not (request.body is None or 'Content-Length' in request.headers)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         if isinstance(timeout, tuple):
18-Jul-2017 13:18:29             try:
18-Jul-2017 13:18:29                 connect, read = timeout
18-Jul-2017 13:18:29                 timeout = TimeoutSauce(connect=connect, read=read)
18-Jul-2017 13:18:29             except ValueError as e:
18-Jul-2017 13:18:29                 # this may raise a string formatting error.
18-Jul-2017 13:18:29                 err = ("Invalid timeout {0}. Pass a (connect, read) "
18-Jul-2017 13:18:29                        "timeout tuple, or a single float to set "
18-Jul-2017 13:18:29                        "both timeouts to the same value".format(timeout))
18-Jul-2017 13:18:29                 raise ValueError(err)
18-Jul-2017 13:18:29         else:
18-Jul-2017 13:18:29             timeout = TimeoutSauce(connect=timeout, read=timeout)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         try:
18-Jul-2017 13:18:29             if not chunked:
18-Jul-2017 13:18:29                 resp = conn.urlopen(
18-Jul-2017 13:18:29                     method=request.method,
18-Jul-2017 13:18:29                     url=url,
18-Jul-2017 13:18:29                     body=request.body,
18-Jul-2017 13:18:29                     headers=request.headers,
18-Jul-2017 13:18:29                     redirect=False,
18-Jul-2017 13:18:29                     assert_same_host=False,
18-Jul-2017 13:18:29                     preload_content=False,
18-Jul-2017 13:18:29                     decode_content=False,
18-Jul-2017 13:18:29                     retries=self.max_retries,
18-Jul-2017 13:18:29                     timeout=timeout
18-Jul-2017 13:18:29                 )
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29             # Send the request.
18-Jul-2017 13:18:29             else:
18-Jul-2017 13:18:29                 if hasattr(conn, 'proxy_pool'):
18-Jul-2017 13:18:29                     conn = conn.proxy_pool
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                 try:
18-Jul-2017 13:18:29                     low_conn.putrequest(request.method,
18-Jul-2017 13:18:29                                         url,
18-Jul-2017 13:18:29                                         skip_accept_encoding=True)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                     for header, value in request.headers.items():
18-Jul-2017 13:18:29                         low_conn.putheader(header, value)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                     low_conn.endheaders()
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                     for i in request.body:
18-Jul-2017 13:18:29                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
18-Jul-2017 13:18:29                         low_conn.send(b'\r\n')
18-Jul-2017 13:18:29                         low_conn.send(i)
18-Jul-2017 13:18:29                         low_conn.send(b'\r\n')
18-Jul-2017 13:18:29                     low_conn.send(b'0\r\n\r\n')
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                     # Receive the response from the server
18-Jul-2017 13:18:29                     try:
18-Jul-2017 13:18:29                         # For Python 2.7+ versions, use buffering of HTTP
18-Jul-2017 13:18:29                         # responses
18-Jul-2017 13:18:29                         r = low_conn.getresponse(buffering=True)
18-Jul-2017 13:18:29                     except TypeError:
18-Jul-2017 13:18:29                         # For compatibility with Python 2.6 versions and back
18-Jul-2017 13:18:29                         r = low_conn.getresponse()
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29                     resp = HTTPResponse.from_httplib(
18-Jul-2017 13:18:29                         r,
18-Jul-2017 13:18:29                         pool=conn,
18-Jul-2017 13:18:29                         connection=low_conn,
18-Jul-2017 13:18:29                         preload_content=False,
18-Jul-2017 13:18:29                         decode_content=False
18-Jul-2017 13:18:29                     )
18-Jul-2017 13:18:29                 except:
18-Jul-2017 13:18:29                     # If we hit any problems here, clean up the connection.
18-Jul-2017 13:18:29                     # Then, reraise so that we can handle the actual exception.
18-Jul-2017 13:18:29                     low_conn.close()
18-Jul-2017 13:18:29                     raise
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         except (ProtocolError, socket.error) as err:
18-Jul-2017 13:18:29             raise ConnectionError(err, request=request)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         except MaxRetryError as e:
18-Jul-2017 13:18:29             if isinstance(e.reason, ConnectTimeoutError):
18-Jul-2017 13:18:29                 # TODO: Remove this in 3.0.0: see #2811
18-Jul-2017 13:18:29                 if not isinstance(e.reason, NewConnectionError):
18-Jul-2017 13:18:29                     raise ConnectTimeout(e, request=request)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29             if isinstance(e.reason, ResponseError):
18-Jul-2017 13:18:29                 raise RetryError(e, request=request)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29             if isinstance(e.reason, _ProxyError):
18-Jul-2017 13:18:29                 raise ProxyError(e, request=request)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29 >           raise ConnectionError(e, request=request)
18-Jul-2017 13:18:29 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/version (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8bcb7d0>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 client.py                   45 DEBUG    GET http://localhost:9000/clowder/api/version
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 _______________________________ test_get_sensors _______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8981310>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_get_sensors(caplog, host, key):
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         response = client.sensors_get()
18-Jul-2017 13:18:29 >       sensors = response.json()
18-Jul-2017 13:18:29 E       AttributeError: 'NoneType' object has no attribute 'json'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_geostreams.py:19: AttributeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                  27 DEBUG    Getting all sensors
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 sensors.py                  31 ERROR    Error retrieving sensor list: not enough arguments for format string
18-Jul-2017 13:18:29 ____________________________ test_raise_for_status _____________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c0d050>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_raise_for_status(caplog, host, key):
18-Jul-2017 13:18:29         client = ClowderClient(host=host, key=key)
18-Jul-2017 13:18:29         try:
18-Jul-2017 13:18:29 >           client.get_json("this_path_does_not_exist")
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_geostreams.py:27:
18-Jul-2017 13:18:29 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18-Jul-2017 13:18:29 pyclowder/client.py:68: in get_json
18-Jul-2017 13:18:29     r = requests.get(url, headers=self.headers)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
18-Jul-2017 13:18:29     return request('get', url, params=params, **kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
18-Jul-2017 13:18:29     return session.request(method=method, url=url, **kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
18-Jul-2017 13:18:29     resp = self.send(prep, **send_kwargs)
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
18-Jul-2017 13:18:29     r = adapter.send(request, **kwargs)
18-Jul-2017 13:18:29 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 self = <requests.adapters.HTTPAdapter object at 0x7fbbe8c06ed0>
18-Jul-2017 13:18:29 request = <PreparedRequest [GET]>, stream = False
18-Jul-2017 13:18:29 timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7fbbe8bf0e10>
18-Jul-2017 13:18:29 verify = True, cert = None, proxies = OrderedDict()
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
18-Jul-2017 13:18:29         [... identical requests.adapters.HTTPAdapter.send source as quoted in test_version above ...]
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29 >           raise ConnectionError(e, request=request)
18-Jul-2017 13:18:29 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/apithis_path_does_not_exist (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8bf0dd0>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 ______________________________ test_sensors_post _______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c9df90>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_sensors_post(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         sensor_json = client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
18-Jul-2017 13:18:29         response = client.sensor_post(sensor_json)
18-Jul-2017 13:18:29 >       body = response.json()
18-Jul-2017 13:18:29 E       AttributeError: 'NoneType' object has no attribute 'json'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_sensors.py:13: AttributeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                  67 DEBUG    Adding sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 client.py                  147 ERROR    POST http://localhost:9000/clowder/api/geostreams/sensors: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8981290>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29 _______________________________ test_sensors_get _______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c3e090>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_sensors_get(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         response = client.sensor_get(sensor_id)
18-Jul-2017 13:18:29 >       sensor = response.json()
18-Jul-2017 13:18:29 E       AttributeError: 'NoneType' object has no attribute 'json'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_sensors.py:24: AttributeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                  40 DEBUG    Getting sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 sensors.py                  44 ERROR    Error retrieving sensor : not enough arguments for format string
18-Jul-2017 13:18:29 _____________________________ test_sensors_delete ______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c55290>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_sensors_delete(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         response = client.sensor_delete(sensor_id)
18-Jul-2017 13:18:29 >       sensor = response.json()
18-Jul-2017 13:18:29 E       AttributeError: 'NoneType' object has no attribute 'json'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_sensors.py:34: AttributeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                 101 DEBUG    Deleting sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 client.py                  211 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8981450>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29 __________________________________ PEP8-check __________________________________
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/tests/test_streams.py:8:1: E302 expected 2 blank lines, found 1
18-Jul-2017 13:18:29 def test_streams_post(caplog, host, key):
18-Jul-2017 13:18:29 ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/tests/test_streams.py:20:24: W291 trailing whitespace
18-Jul-2017 13:18:29     assert "id" in body
18-Jul-2017 13:18:29                        ^
18-Jul-2017 13:18:29 /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/tests/test_streams.py:39:54: W291 trailing whitespace
18-Jul-2017 13:18:29     response = stream_client.stream_delete(stream_id)
18-Jul-2017 13:18:29                                                      ^
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 ______________________________ test_streams_post _______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe88e7650>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_streams_post(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id, stream_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         sensor_client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         sensor_json = sensor_client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
18-Jul-2017 13:18:29         sensor_body = sensor_client.sensor_post_json(sensor_json)
18-Jul-2017 13:18:29 >       sensor_id = sensor_body['id']
18-Jul-2017 13:18:29 E       TypeError: 'NoneType' object has no attribute '__getitem__'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_streams.py:14: TypeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                  80 DEBUG    Adding or getting sensor
18-Jul-2017 13:18:29 sensors.py                  53 DEBUG    Getting sensor Test Sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 sensors.py                  57 ERROR    Error retrieving sensor Test Sensor: not enough arguments for format string
18-Jul-2017 13:18:29 sensors.py                  92 ERROR    Error adding sensor Test Sensor: 'NoneType' object has no attribute 'json'
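Every `'NoneType' object has no attribute 'json'` failure in this run has the same shape: the API helper catches the request exception, logs it, and implicitly returns `None`, which the test then dereferences. A defensive caller, sketched against a hypothetical stub with the same return-`None`-on-failure contract, fails fast with a clear message instead:

```python
class FakeSensorsApi:
    """Hypothetical stand-in for SensorsApi: returns None when the HTTP
    call fails, mirroring the behaviour the captured log shows."""

    def sensor_post(self, sensor_json):
        return None  # e.g. connection refused, swallowed by the client


def sensor_post_checked(client, sensor_json):
    """Post a sensor but surface failures instead of propagating None."""
    response = client.sensor_post(sensor_json)
    if response is None:
        raise RuntimeError("sensor_post failed; see client error log")
    return response.json()


try:
    sensor_post_checked(FakeSensorsApi(), {"name": "Test Sensor"})
except RuntimeError as err:
    print(err)  # sensor_post failed; see client error log
```

Having the client re-raise (or the caller check for `None`) would turn these `AttributeError`/`TypeError` cascades into single, descriptive failures.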
18-Jul-2017 13:18:29 _______________________________ test_streams_get _______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c3e1d0>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_streams_get(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id, stream_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         stream_client = StreamsApi(host=host, key=key)
18-Jul-2017 13:18:29 >       stream = stream_client.stream_get_by_name("Test Sensor")
18-Jul-2017 13:18:29 E       AttributeError: 'StreamsApi' object has no attribute 'stream_get_by_name'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_streams.py:27: AttributeError
18-Jul-2017 13:18:29 _____________________________ test_streams_delete ______________________________
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fbbe8c3e850>
18-Jul-2017 13:18:29 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29     def test_streams_delete(caplog, host, key):
18-Jul-2017 13:18:29         global sensor_id, stream_id
18-Jul-2017 13:18:29         caplog.setLevel(logging.DEBUG)
18-Jul-2017 13:18:29         sensor_client = SensorsApi(host=host, key=key)
18-Jul-2017 13:18:29         response = sensor_client.sensor_delete(sensor_id)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29         stream_client = StreamsApi(host=host, key=key)
18-Jul-2017 13:18:29         response = stream_client.stream_delete(stream_id)
18-Jul-2017 13:18:29    
18-Jul-2017 13:18:29 >       stream = response.json()
18-Jul-2017 13:18:29 E       AttributeError: 'NoneType' object has no attribute 'json'
18-Jul-2017 13:18:29
18-Jul-2017 13:18:29 tests/test_streams.py:41: AttributeError
18-Jul-2017 13:18:29 --------------------------------- Captured log ---------------------------------
18-Jul-2017 13:18:29 sensors.py                 101 DEBUG    Deleting sensor
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 client.py                  211 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8976250>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29 streams.py                  90 DEBUG    Deleting stream
18-Jul-2017 13:18:29 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
18-Jul-2017 13:18:29 client.py                  211 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/streams/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/streams/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fbbe8976390>: Failed to establish a new connection: [Errno 111] Connection refused',))
18-Jul-2017 13:18:29 ============================ pytest-warning summary ============================
18-Jul-2017 13:18:29 WI1 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/pytest_capturelog.py:171 'pytest_runtest_makereport' hook uses deprecated __multicall__ argument
18-Jul-2017 13:18:29 WC1 None pytest_funcarg__caplog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
18-Jul-2017 13:18:29 WC1 None pytest_funcarg__capturelog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
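Both WC1 warnings above flag the same deprecated pattern. A sketch of the migration they ask for, assuming a fixture body along these lines (the real fixture bodies come from the pytest-capturelog plugin and are not shown in this log):

```python
import pytest

# Deprecated spelling flagged by the warnings, removed in pytest 4.0:
#
#     def pytest_funcarg__caplog(request):
#         ...
#
# Replacement: drop the "pytest_funcarg__" prefix and register the
# fixture via the decorator instead.

@pytest.fixture
def caplog(request):
    # Body is illustrative only; the actual caplog fixture is
    # provided by the pytest-capturelog plugin.
    return request
```

pytest discovers the decorated function by name, so tests that already take a `caplog` argument need no changes.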
18-Jul-2017 13:18:29 =========== 18 failed, 22 passed, 3 pytest-warnings in 0.60 seconds ============
18-Jul-2017 13:18:29 Failing task since return code of [/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-23-ScriptBuildTask-5964688587154337776.sh] was 1 while expected 0
18-Jul-2017 13:18:29 Finished task 'pytest' with result: Failed
18-Jul-2017 13:18:29 Starting task 'test results' of type 'com.atlassian.bamboo.plugins.testresultparser:task.testresultparser.junit'
18-Jul-2017 13:18:29 Parsing test results under /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1...
18-Jul-2017 13:18:29 Failing task since 18 failing test cases were found.
18-Jul-2017 13:18:29 Finished task 'test results' with result: Failed
18-Jul-2017 13:18:29 Running post build plugin 'Docker Container Cleanup'
18-Jul-2017 13:18:29 Running post build plugin 'NCover Results Collector'
18-Jul-2017 13:18:29 Running post build plugin 'Clover Results Collector'
18-Jul-2017 13:18:29 Running post build plugin 'npm Cache Cleanup'
18-Jul-2017 13:18:29 Running post build plugin 'Artifact Copier'
18-Jul-2017 13:18:29 Finalising the build...
18-Jul-2017 13:18:29 Stopping timer.
18-Jul-2017 13:18:29 Build CATS-PYC26-JOB1-23 completed.
18-Jul-2017 13:18:29 Running on server: post build plugin 'NCover Results Collector'
18-Jul-2017 13:18:29 Running on server: post build plugin 'Build Hanging Detection Configuration'
18-Jul-2017 13:18:29 Running on server: post build plugin 'Clover Delta Calculator'
18-Jul-2017 13:18:29 Running on server: post build plugin 'Maven Dependencies Postprocessor'
18-Jul-2017 13:18:29 All post build plugins have finished
18-Jul-2017 13:18:29 Generating build results summary...
18-Jul-2017 13:18:29 Saving build results to disk...
18-Jul-2017 13:18:29 Logging substituted variables...
18-Jul-2017 13:18:29 Indexing build results...
18-Jul-2017 13:18:29 Finished building CATS-PYC26-JOB1-23.