Next Generation of pyClowder
Build: #7 failed
Job: Default Job failed
Build log
The build generated 728 lines of output.
26-Jul-2017 10:51:45 | Build Clowder - pyclowder2 - seven-miles-changes - Default Job #7 (CATS-PYC245-JOB1-7) started building on agent buildserver-2.os.ncsa.edu |
26-Jul-2017 10:51:45 | Remote agent on host buildserver-2.os.ncsa.edu |
26-Jul-2017 10:51:45 | Build working directory is /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1 |
26-Jul-2017 10:51:45 | Executing build Clowder - pyclowder2 - seven-miles-changes - Default Job #7 (CATS-PYC245-JOB1-7) |
26-Jul-2017 10:51:45 | Starting task 'Checkout Default Repository' of type 'com.atlassian.bamboo.plugins.vcs:task.vcs.checkout' |
26-Jul-2017 10:51:45 | Updating source code to revision: 076160c339570015dcf4466f3d492001a11390ec |
26-Jul-2017 10:51:45 | Creating local git repository in '/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/.git'. |
26-Jul-2017 10:51:45 | Initialized empty Git repository in /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/.git/ |
26-Jul-2017 10:51:45 | Fetching 'refs/heads/seven-miles-changes' from 'ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git'. Will try to do a shallow fetch. |
26-Jul-2017 10:51:45 | Warning: Permanently added '[127.0.0.1]:34979' (RSA) to the list of known hosts. |
26-Jul-2017 10:51:45 | From ssh://127.0.0.1:34979/cats/pyclowder2 |
26-Jul-2017 10:51:45 | * [new branch] seven-miles-changes -> seven-miles-changes |
26-Jul-2017 10:51:45 | Checking out revision 076160c339570015dcf4466f3d492001a11390ec. |
26-Jul-2017 10:51:45 | Switched to branch 'seven-miles-changes' |
26-Jul-2017 10:51:46 | Updated source code to revision: 076160c339570015dcf4466f3d492001a11390ec |
26-Jul-2017 10:51:46 | Finished task 'Checkout Default Repository' with result: Success |
26-Jul-2017 10:51:46 | Running pre-build action: VCS Version Collector |
26-Jul-2017 10:51:46 | Starting task 'pytest' of type 'com.atlassian.bamboo.plugins.scripttask:task.builder.script' |
26-Jul-2017 10:51:46 | Beginning to execute external process for build 'Clowder - pyclowder2 - seven-miles-changes - Default Job #7 (CATS-PYC245-JOB1-7)' ... running command line: /home/bamboo/bamboo-agent-home/temp/CATS-PYC245-JOB1-7-ScriptBuildTask-7594916351186564715.sh ... in: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1 ... using extra environment variables: bamboo_planRepository_1_branch=seven-miles-changes bamboo_capability_system_builder_node_Node_js_v6_9_1=/home/bamboo/node-v6.9.1/bin/node bamboo_capability_system_builder_command_npm_6=/home/bamboo/node-v6.9.1/bin/npm bamboo_capability_system_builder_command_buckminster_4_3=/home/bamboo/buckminster-4.3/buckminster bamboo_capability_system_builder_command_buckminster_4_2=/home/bamboo/buckminster-4.2/buckminster bamboo_planRepository_1_branchDisplayName=seven-miles-changes bamboo_repository_revision_number=076160c339570015dcf4466f3d492001a11390ec bamboo_resultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC245-JOB1-7 bamboo_capability_system_builder_command_sphinx=/usr/bin/sphinx-build bamboo_planRepository_1_name=pyclowder2 bamboo_build_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1 bamboo_buildKey=CATS-PYC245-JOB1 bamboo_capability_system_os=linux bamboo_shortPlanName=seven-miles-changes bamboo_planRepository_name=pyclowder2 bamboo_buildNumber=7 bamboo_shortJobName=Default Job bamboo_buildResultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC245-JOB1-7 bamboo_planRepository_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git bamboo_agentId=140673028 bamboo_planName=Clowder - pyclowder2 - seven-miles-changes bamboo_shortPlanKey=PYC245 bamboo_capability_system_builder_command_sbt_0_13=/home/bamboo/sbt-0.13.2/bin/sbt bamboo_shortJobKey=JOB1 bamboo_capability_system_builder_node_Node_js_v0_10_28=/home/bamboo/node-v0.10.28/bin/node 
bamboo_planRepository_revision=076160c339570015dcf4466f3d492001a11390ec bamboo_repository_previous_revision_number=7c83cc93ab724570d20506fd8c212353faed5766 bamboo_buildTimeStamp=2017-07-26T10:51:45.546-05:00 bamboo_capability_system_builder_command_npm=/home/bamboo/node-v0.10.28/bin/npm bamboo_planRepository_previousRevision=7c83cc93ab724570d20506fd8c212353faed5766 bamboo_capability_system_builder_mvn2_Maven_2=/home/bamboo/apache-maven-2.2.1 bamboo_buildResultKey=CATS-PYC245-JOB1-7 bamboo_repository_git_branch=seven-miles-changes bamboo_repository_branch_name=seven-miles-changes bamboo_buildPlanName=Clowder - pyclowder2 - seven-miles-changes - Default Job bamboo_repository_148209695_branch_name=seven-miles-changes bamboo_planRepository_1_revision=076160c339570015dcf4466f3d492001a11390ec bamboo_capability_system_builder_command_python3=/usr/bin/python3 bamboo_repository_name=pyclowder2 bamboo_repository_148209695_name=pyclowder2 bamboo_buildFailed=false bamboo_capability_system_docker_executable=/usr/bin/docker bamboo_capability_system_builder_command_grunt=/home/bamboo/node-v0.10.28/bin/grunt bamboo_repository_148209695_revision_number=076160c339570015dcf4466f3d492001a11390ec bamboo_planRepository_branch=seven-miles-changes bamboo_repository_148209695_previous_revision_number=7c83cc93ab724570d20506fd8c212353faed5766 bamboo_agentWorkingDirectory=/home/bamboo/bamboo-agent-home/xml-data/build-dir bamboo_capability_system_git_executable=/usr/bin/git bamboo_planRepository_1_previousRevision=7c83cc93ab724570d20506fd8c212353faed5766 bamboo_repository_git_username= bamboo_capability_system_builder_sbt_SBT_0_13_13=/home/bamboo/sbt-0.13.13 bamboo_planRepository_branchDisplayName=seven-miles-changes bamboo_capability_system_builder_command_phantomjs=/home/bamboo/phantomjs-1.9.8/bin/phantomjs bamboo_planRepository_1_type=bbserver bamboo_planRepository_branchName=seven-miles-changes bamboo_capability_system_builder_command_python2_7=/usr/bin/python2.7 
bamboo_capability_system_hostname=buildserver-1 bamboo_capability_system_jdk_JDK=/home/bamboo/jdk1.8.0_66 bamboo_repository_148209695_git_username= bamboo_capability_system_software_mongo=/usr/bin/mongo bamboo_plan_storageTag=plan-148013117 bamboo_capability_system_software_rabbitmq=/usr/sbin/rabbitmqctl bamboo_capability_system_builder_command_casperjs=/home/bamboo/node-v0.10.28/bin/casperjs bamboo_planRepository_type=bbserver bamboo_planRepository_1_username= bamboo_capability_system_jdk_JDK_1_8_0_66=/home/bamboo/jdk1.8.0_66 bamboo_repository_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git bamboo_capability_system_builder_node_Node_js=/home/bamboo/node-v0.10.28/bin/node bamboo_capability_system_builder_ant_Ant=/home/bamboo/apache-ant-1.9.4 bamboo_capability_system_builder_mvn3_Maven_3=/home/bamboo/apache-maven-3.3.9 bamboo_repository_148209695_git_branch=seven-miles-changes bamboo_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1 bamboo_planKey=CATS-PYC245 bamboo_repository_148209695_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git bamboo_planRepository_1_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git bamboo_planRepository_username= bamboo_capability_system_jdk_JDK_1_8=/home/bamboo/jdk1.8.0_66 bamboo_capability_system_jdk_JDK_1_6=/home/bamboo/jdk1.6.0_41 bamboo_capability_system_builder_command_mkcrx=/home/bamboo/mkcrx/mkcrx.sh bamboo_capability_system_jdk_JDK_1_7=/home/bamboo/jdk1.7.0_60 bamboo_planRepository_1_branchName=seven-miles-changes |
26-Jul-2017 10:51:46 | New python executable in /tmp/virtualenv/pyclowder2/bin/python2 |
26-Jul-2017 10:51:46 | Also creating executable in /tmp/virtualenv/pyclowder2/bin/python |
26-Jul-2017 10:51:48 | Installing setuptools, pkg_resources, pip, wheel...done. |
26-Jul-2017 10:51:48 | Running virtualenv with interpreter /usr/bin/python2 |
26-Jul-2017 10:51:48 | Collecting enum34==1.1.6 (from -r requirements.txt (line 1)) |
26-Jul-2017 10:51:48 | Using cached enum34-1.1.6-py2-none-any.whl |
26-Jul-2017 10:51:48 | Collecting et-xmlfile==1.0.1 (from -r requirements.txt (line 2)) |
26-Jul-2017 10:51:49 | Collecting jdcal==1.3 (from -r requirements.txt (line 3)) |
26-Jul-2017 10:51:49 | Collecting openpyxl==2.4.1 (from -r requirements.txt (line 4)) |
26-Jul-2017 10:51:49 | Collecting pika==0.10.0 (from -r requirements.txt (line 5)) |
26-Jul-2017 10:51:49 | Using cached pika-0.10.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:49 | Collecting python-dateutil==2.6.0 (from -r requirements.txt (line 6)) |
26-Jul-2017 10:51:49 | Using cached python_dateutil-2.6.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:49 | Collecting pytz==2016.10 (from -r requirements.txt (line 7)) |
26-Jul-2017 10:51:50 | Using cached pytz-2016.10-py2.py3-none-any.whl |
26-Jul-2017 10:51:50 | Collecting PyYAML==3.11 (from -r requirements.txt (line 8)) |
26-Jul-2017 10:51:50 | Collecting requests==2.10.0 (from -r requirements.txt (line 9)) |
26-Jul-2017 10:51:50 | Using cached requests-2.10.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:50 | Collecting six==1.10.0 (from -r requirements.txt (line 10)) |
26-Jul-2017 10:51:50 | Using cached six-1.10.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:50 | Collecting wheel==0.24.0 (from -r requirements.txt (line 11)) |
26-Jul-2017 10:51:50 | Using cached wheel-0.24.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:50 | Collecting pytest==3.0.3 (from -r requirements.txt (line 12)) |
26-Jul-2017 10:51:50 | Using cached pytest-3.0.3-py2.py3-none-any.whl |
26-Jul-2017 10:51:50 | Collecting pytest-pep8==1.0.6 (from -r requirements.txt (line 13)) |
26-Jul-2017 10:51:50 | Collecting pytest-capturelog==0.7 (from -r requirements.txt (line 14)) |
26-Jul-2017 10:51:50 | Collecting urllib3==1.14 (from -r requirements.txt (line 15)) |
26-Jul-2017 10:51:51 | Using cached urllib3-1.14-py2.py3-none-any.whl |
26-Jul-2017 10:51:51 | Collecting py>=1.4.29 (from pytest==3.0.3->-r requirements.txt (line 12)) |
26-Jul-2017 10:51:51 | Using cached py-1.4.34-py2.py3-none-any.whl |
26-Jul-2017 10:51:51 | Collecting pep8>=1.3 (from pytest-pep8==1.0.6->-r requirements.txt (line 13)) |
26-Jul-2017 10:51:51 | Using cached pep8-1.7.0-py2.py3-none-any.whl |
26-Jul-2017 10:51:51 | Collecting pytest-cache (from pytest-pep8==1.0.6->-r requirements.txt (line 13)) |
26-Jul-2017 10:51:51 | Collecting execnet>=1.1.dev1 (from pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13)) |
26-Jul-2017 10:51:51 | Using cached execnet-1.4.1-py2.py3-none-any.whl |
26-Jul-2017 10:51:51 | Collecting apipkg>=1.4 (from execnet>=1.1.dev1->pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13)) |
26-Jul-2017 10:51:51 | Using cached apipkg-1.4-py2.py3-none-any.whl |
26-Jul-2017 10:51:51 | Installing collected packages: enum34, et-xmlfile, jdcal, openpyxl, pika, six, python-dateutil, pytz, PyYAML, requests, wheel, py, pytest, pep8, apipkg, execnet, pytest-cache, pytest-pep8, pytest-capturelog, urllib3 |
26-Jul-2017 10:51:51 | Found existing installation: wheel 0.30.0a0 |
26-Jul-2017 10:51:51 | Uninstalling wheel-0.30.0a0: |
26-Jul-2017 10:51:51 | Successfully uninstalled wheel-0.30.0a0 |
26-Jul-2017 10:51:52 | Successfully installed PyYAML-3.11 apipkg-1.4 enum34-1.1.6 et-xmlfile-1.0.1 execnet-1.4.1 jdcal-1.3 openpyxl-2.4.1 pep8-1.7.0 pika-0.10.0 py-1.4.34 pytest-3.0.3 pytest-cache-1.0 pytest-capturelog-0.7 pytest-pep8-1.0.6 python-dateutil-2.6.0 pytz-2016.10 requests-2.10.0 six-1.10.0 urllib3-1.14 wheel-0.24.0 |
26-Jul-2017 10:51:52 | ============================= test session starts ============================== |
26-Jul-2017 10:51:52 | platform linux2 -- Python 2.7.12, pytest-3.0.3, py-1.4.34, pluggy-0.4.0 |
26-Jul-2017 10:51:52 | rootdir: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1, inifile: setup.cfg |
26-Jul-2017 10:51:52 | plugins: pep8-1.0.6, capturelog-0.7 |
26-Jul-2017 10:51:52 | collected 39 items |
26-Jul-2017 10:51:52 | |
26-Jul-2017 10:51:52 | setup.py . |
26-Jul-2017 10:51:52 | docs/source/conf.py F |
26-Jul-2017 10:51:52 | pyclowder/__init__.py . |
26-Jul-2017 10:51:52 | pyclowder/client.py F |
26-Jul-2017 10:51:52 | pyclowder/collections.py . |
26-Jul-2017 10:51:52 | pyclowder/connectors.py . |
26-Jul-2017 10:51:52 | pyclowder/datasets.py . |
26-Jul-2017 10:51:52 | pyclowder/extractors.py . |
26-Jul-2017 10:51:53 | pyclowder/files.py . |
26-Jul-2017 10:51:53 | pyclowder/sections.py . |
26-Jul-2017 10:51:53 | pyclowder/utils.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/__init__.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/cache.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/csv.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/datapoints.py F |
26-Jul-2017 10:51:53 | pyclowder/geostreams/datasets.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/geocode_convert.py F |
26-Jul-2017 10:51:53 | pyclowder/geostreams/map_names.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/sensors.py F |
26-Jul-2017 10:51:53 | pyclowder/geostreams/streams.py . |
26-Jul-2017 10:51:53 | pyclowder/geostreams/time_transformers.py F |
26-Jul-2017 10:51:53 | sample-extractors/echo/echo.py . |
26-Jul-2017 10:51:53 | sample-extractors/wordcount/wordcount.py . |
26-Jul-2017 10:51:53 | tests/__init__.py . |
26-Jul-2017 10:51:53 | tests/conftest.py F |
26-Jul-2017 10:51:53 | tests/test_datapoints.py FF |
26-Jul-2017 10:51:53 | tests/test_geostreams.py .FFF |
26-Jul-2017 10:51:53 | tests/test_sensors.py .FFF |
26-Jul-2017 10:51:53 | tests/test_streams.py FFFF |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | generated xml file: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/test-reports/results.xml |
26-Jul-2017 10:51:53 | =================================== FAILURES =================================== |
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/docs/source/conf.py:34:5: E128 continuation line under-indented for visual indent |
26-Jul-2017 10:51:53 | 'sphinx.ext.doctest', |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/docs/source/conf.py:35:5: E128 continuation line under-indented for visual indent |
26-Jul-2017 10:51:53 | 'sphinx.ext.viewcode', |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/docs/source/conf.py:36:5: E128 continuation line under-indented for visual indent |
26-Jul-2017 10:51:53 | 'sphinx.ext.githubpages'] |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/docs/source/conf.py:159:1: W391 blank line at end of file |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
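[Editor's note] A hedged sketch of how the E128 warnings above could be resolved: align each continuation line under the opening bracket. Only the last three entries appear in the log; the first entry (`sphinx.ext.autodoc`) is an assumption.

```python
# E128 fix sketched from the log: continuation lines aligned under the
# opening bracket of the list. First entry is assumed; the log shows
# only the under-indented continuations on conf.py lines 34-36.
extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.doctest',
              'sphinx.ext.viewcode',
              'sphinx.ext.githubpages']
```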
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/client.py:145:121: E501 line too long (137 > 120 characters) |
26-Jul-2017 10:51:53 | return requests.post(url, params=params, data=json.dumps(content), headers=self.headers, auth=(self.username, self.password)) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/client.py:163:121: E501 line too long (127 > 120 characters) |
26-Jul-2017 10:51:53 | return requests.post(url, params=params, files={"File": open(filename, 'rb')}, auth=(self.username, self.password)) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
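[Editor's note] The two E501 warnings above come from single-line `requests.post(...)` calls longer than 120 characters. A minimal sketch of the usual fix, one keyword argument per wrapped line; the helper function here is hypothetical and only assembles the keyword arguments, so the sketch runs without making an HTTP request.

```python
import json


def build_post_kwargs(params, content, headers, username, password):
    # Hypothetical helper: gathers the arguments from the over-long
    # requests.post(...) call in client.py across several lines so
    # each stays under the 120-character limit (E501).
    return dict(params=params,
                data=json.dumps(content),
                headers=headers,
                auth=(username, password))


kwargs = build_post_kwargs({'key': 'r1ek3rs'}, {'name': 'test'},
                           {'Content-Type': 'application/json'},
                           'user', 'pass')
```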
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:74:121: E501 line too long (181 > 120 characters) |
26-Jul-2017 10:51:53 | def datapoint_create_json(self, start_time, end_time, longitude, latitude, sensor_id, stream_id, sensor_name, properties, owner=None, source=None, procedures=None, elevation=0): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:78:1: W293 blank line contains whitespace |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:79:27: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param start_time: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:80:25: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param end_time: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:81:26: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param longitude: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:82:25: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param latitude: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:83:26: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param sensor_id: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:84:26: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param stream_id: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:85:28: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param sensor_name: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:86:27: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param properties: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:87:22: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param owner: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:88:23: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param source: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:89:27: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param procedures: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:90:26: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :param elevation: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/datapoints.py:91:17: W291 trailing whitespace |
26-Jul-2017 10:51:53 | :return: |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
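[Editor's note] The 181-character signature flagged above can be wrapped at the parameter list. A sketch under stated assumptions: the log shows only the signature, so the class and body here are stand-ins, not the real implementation in pyclowder/geostreams/datapoints.py.

```python
class DatapointsApi(object):
    # Stand-in for the class in pyclowder/geostreams/datapoints.py;
    # only the wrapped signature is taken from the log.
    def datapoint_create_json(self, start_time, end_time, longitude, latitude,
                              sensor_id, stream_id, sensor_name, properties,
                              owner=None, source=None, procedures=None,
                              elevation=0):
        # Stub body for illustration; the real body is not in the log.
        return {"start_time": start_time, "end_time": end_time,
                "elevation": elevation}
```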
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:6:1: E302 expected 2 blank lines, found 0 |
26-Jul-2017 10:51:53 | def dms2dec(dms): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:38: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:40: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:58: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:60: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:78: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:80: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:109: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:111: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:121: E501 line too long (175 > 120 characters) |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:129: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:131: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:149: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:7:151: E251 unexpected spaces around keyword / parameter equals |
26-Jul-2017 10:51:53 | dec = list(LatLon(Latitude(degree = dms[0][0], minute = dms[0][1], second = dms[0][2]), Longitude(degree = dms[1][0], minute = dms[1][1], second = dms[1][2])).to_string()) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/geocode_convert.py:9:26: E231 missing whitespace after ',' |
26-Jul-2017 10:51:53 | return [float(dec[0]),float(dec[1])] |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
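[Editor's note] The E251 warnings above are fixed by removing the spaces around `=` in keyword arguments (`degree=dms[0][0]`, not `degree = dms[0][0]`), and E501 by wrapping the call. The E231 warning on line 9 is a missing space after a comma; a runnable sketch of that fix (the surrounding LatLon conversion is omitted so the example has no third-party dependency):

```python
def dms_pair_to_floats(dec):
    # E231 fix from geocode_convert.py line 9: a space after the comma.
    # (In the E251 fixes above, keyword arguments would read
    # degree=dms[0][0], minute=dms[0][1], second=dms[0][2].)
    return [float(dec[0]), float(dec[1])]
```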
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/sensors.py:69:12: E111 indentation is not a multiple of four |
26-Jul-2017 10:51:53 | return self.client.post("/geostreams/sensors", sensor) |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/time_transformers.py:2:30: E231 missing whitespace after ',' |
26-Jul-2017 10:51:53 | from datetime import datetime,timedelta |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/time_transformers.py:14:1: E302 expected 2 blank lines, found 1 |
26-Jul-2017 10:51:53 | def julian_day_to_month_day(year,julian_day): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/time_transformers.py:14:33: E231 missing whitespace after ',' |
26-Jul-2017 10:51:53 | def julian_day_to_month_day(year,julian_day): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/time_transformers.py:20:1: E302 expected 2 blank lines, found 1 |
26-Jul-2017 10:51:53 | def calendar_date2utc(date_in,time_zone="America/Chicago"): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/pyclowder/geostreams/time_transformers.py:20:30: E231 missing whitespace after ',' |
26-Jul-2017 10:51:53 | def calendar_date2utc(date_in,time_zone="America/Chicago"): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
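[Editor's note] The time_transformers.py warnings above are all cosmetic: E231 wants a space after each comma in the import and the signatures, and E302 wants two blank lines before each top-level def. A sketch with those fixes applied; the function body is assumed (the log shows only signatures), implementing the day-of-year conversion the name suggests.

```python
# E231 fix: space after the comma in the import.
from datetime import datetime, timedelta


# E302 fix: two blank lines before a top-level def; E231 fix in the
# signature. Body is an assumption consistent with the function name.
def julian_day_to_month_day(year, julian_day):
    date = datetime(year, 1, 1) + timedelta(days=julian_day - 1)
    return date.month, date.day
```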
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/tests/conftest.py:14:1: E302 expected 2 blank lines, found 1 |
26-Jul-2017 10:51:53 | @pytest.fixture(scope="module") |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/tests/test_datapoints.py:11:39: W292 no newline at end of file |
26-Jul-2017 10:51:53 | assert response.status_code != 200 |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | _____________________ test_datapoints_count_by_sensor_get ______________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fecdd290> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_datapoints_count_by_sensor_get(caplog, host, key): |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = DatapointsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = client.datapoints_count_by_sensor_get(950) |
26-Jul-2017 10:51:53 | > sensors = response.text |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'text' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_datapoints.py:9: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | datapoints.py 41 DEBUG Counting datapoints by sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | datapoints.py 45 ERROR Error counting datapoints by sensor 950: not enough arguments for format string |
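[Editor's note] The ERROR line above ("not enough arguments for format string") points at a %-style string with more placeholders than supplied arguments, a common cause of exactly this message. The actual statement in datapoints.py is not shown in the log, so this is a hedged sketch of the pattern and its fix.

```python
import logging

logger = logging.getLogger(__name__)

sensor_id = 950
exc = Exception("connection refused")  # placeholder for the real error

# Broken pattern (would raise "not enough arguments for format string"
# because two %s placeholders get only one argument):
#   logger.error("Error counting datapoints by sensor %s: %s" % sensor_id)

# Fixed: supply every argument, ideally via logging's lazy formatting.
message = "Error counting datapoints by sensor %s: %s" % (sensor_id, exc)
logger.error("Error counting datapoints by sensor %s: %s", sensor_id, exc)
```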
26-Jul-2017 10:51:53 | _________________________________ test_version _________________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fec61a90> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_version(caplog, host, key): |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = ClowderClient(host=host, key=key) |
26-Jul-2017 10:51:53 | > version = client.version() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_geostreams.py:10: |
26-Jul-2017 10:51:53 | _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ |
26-Jul-2017 10:51:53 | pyclowder/client.py:46: in version |
26-Jul-2017 10:51:53 | r = requests.get(url) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get |
26-Jul-2017 10:51:53 | return request('get', url, params=params, **kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request |
26-Jul-2017 10:51:53 | return session.request(method=method, url=url, **kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request |
26-Jul-2017 10:51:53 | resp = self.send(prep, **send_kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send |
26-Jul-2017 10:51:53 | r = adapter.send(request, **kwargs) |
26-Jul-2017 10:51:53 | _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | self = <requests.adapters.HTTPAdapter object at 0x7fb5fea1c090> |
26-Jul-2017 10:51:53 | request = <PreparedRequest [GET]>, stream = False |
26-Jul-2017 10:51:53 | timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7fb5fec61690> |
26-Jul-2017 10:51:53 | verify = True, cert = None, proxies = OrderedDict() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None): |
26-Jul-2017 10:51:53 | """Sends PreparedRequest object. Returns Response object. |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. |
26-Jul-2017 10:51:53 | :param stream: (optional) Whether to stream the request content. |
26-Jul-2017 10:51:53 | :param timeout: (optional) How long to wait for the server to send |
26-Jul-2017 10:51:53 | data before giving up, as a float, or a :ref:`(connect timeout, |
26-Jul-2017 10:51:53 | read timeout) <timeouts>` tuple. |
26-Jul-2017 10:51:53 | :type timeout: float or tuple |
26-Jul-2017 10:51:53 | :param verify: (optional) Whether to verify SSL certificates. |
26-Jul-2017 10:51:53 | :param cert: (optional) Any user-provided SSL certificate to be trusted. |
26-Jul-2017 10:51:53 | :param proxies: (optional) The proxies dictionary to apply to the request. |
26-Jul-2017 10:51:53 | """ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | conn = self.get_connection(request.url, proxies) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | self.cert_verify(conn, request.url, verify, cert) |
26-Jul-2017 10:51:53 | url = self.request_url(request, proxies) |
26-Jul-2017 10:51:53 | self.add_headers(request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | chunked = not (request.body is None or 'Content-Length' in request.headers) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(timeout, tuple): |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | connect, read = timeout |
26-Jul-2017 10:51:53 | timeout = TimeoutSauce(connect=connect, read=read) |
26-Jul-2017 10:51:53 | except ValueError as e: |
26-Jul-2017 10:51:53 | # this may raise a string formatting error. |
26-Jul-2017 10:51:53 | err = ("Invalid timeout {0}. Pass a (connect, read) " |
26-Jul-2017 10:51:53 | "timeout tuple, or a single float to set " |
26-Jul-2017 10:51:53 | "both timeouts to the same value".format(timeout)) |
26-Jul-2017 10:51:53 | raise ValueError(err) |
26-Jul-2017 10:51:53 | else: |
26-Jul-2017 10:51:53 | timeout = TimeoutSauce(connect=timeout, read=timeout) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | if not chunked: |
26-Jul-2017 10:51:53 | resp = conn.urlopen( |
26-Jul-2017 10:51:53 | method=request.method, |
26-Jul-2017 10:51:53 | url=url, |
26-Jul-2017 10:51:53 | body=request.body, |
26-Jul-2017 10:51:53 | headers=request.headers, |
26-Jul-2017 10:51:53 | redirect=False, |
26-Jul-2017 10:51:53 | assert_same_host=False, |
26-Jul-2017 10:51:53 | preload_content=False, |
26-Jul-2017 10:51:53 | decode_content=False, |
26-Jul-2017 10:51:53 | retries=self.max_retries, |
26-Jul-2017 10:51:53 | timeout=timeout |
26-Jul-2017 10:51:53 | ) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | # Send the request. |
26-Jul-2017 10:51:53 | else: |
26-Jul-2017 10:51:53 | if hasattr(conn, 'proxy_pool'): |
26-Jul-2017 10:51:53 | conn = conn.proxy_pool |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | low_conn.putrequest(request.method, |
26-Jul-2017 10:51:53 | url, |
26-Jul-2017 10:51:53 | skip_accept_encoding=True) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | for header, value in request.headers.items(): |
26-Jul-2017 10:51:53 | low_conn.putheader(header, value) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | low_conn.endheaders() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | for i in request.body: |
26-Jul-2017 10:51:53 | low_conn.send(hex(len(i))[2:].encode('utf-8')) |
26-Jul-2017 10:51:53 | low_conn.send(b'\r\n') |
26-Jul-2017 10:51:53 | low_conn.send(i) |
26-Jul-2017 10:51:53 | low_conn.send(b'\r\n') |
26-Jul-2017 10:51:53 | low_conn.send(b'0\r\n\r\n') |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | # Receive the response from the server |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | # For Python 2.7+ versions, use buffering of HTTP |
26-Jul-2017 10:51:53 | # responses |
26-Jul-2017 10:51:53 | r = low_conn.getresponse(buffering=True) |
26-Jul-2017 10:51:53 | except TypeError: |
26-Jul-2017 10:51:53 | # For compatibility with Python 2.6 versions and back |
26-Jul-2017 10:51:53 | r = low_conn.getresponse() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | resp = HTTPResponse.from_httplib( |
26-Jul-2017 10:51:53 | r, |
26-Jul-2017 10:51:53 | pool=conn, |
26-Jul-2017 10:51:53 | connection=low_conn, |
26-Jul-2017 10:51:53 | preload_content=False, |
26-Jul-2017 10:51:53 | decode_content=False |
26-Jul-2017 10:51:53 | ) |
26-Jul-2017 10:51:53 | except: |
26-Jul-2017 10:51:53 | # If we hit any problems here, clean up the connection. |
26-Jul-2017 10:51:53 | # Then, reraise so that we can handle the actual exception. |
26-Jul-2017 10:51:53 | low_conn.close() |
26-Jul-2017 10:51:53 | raise |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | except (ProtocolError, socket.error) as err: |
26-Jul-2017 10:51:53 | raise ConnectionError(err, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | except MaxRetryError as e: |
26-Jul-2017 10:51:53 | if isinstance(e.reason, ConnectTimeoutError): |
26-Jul-2017 10:51:53 | # TODO: Remove this in 3.0.0: see #2811 |
26-Jul-2017 10:51:53 | if not isinstance(e.reason, NewConnectionError): |
26-Jul-2017 10:51:53 | raise ConnectTimeout(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(e.reason, ResponseError): |
26-Jul-2017 10:51:53 | raise RetryError(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(e.reason, _ProxyError): |
26-Jul-2017 10:51:53 | raise ProxyError(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | > raise ConnectionError(e, request=request) |
26-Jul-2017 10:51:53 | E ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/version (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fea1c2d0>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | client.py 45 DEBUG GET http://localhost:9000/clowder/api/version |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
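The traceback above (and most of the failures that follow) comes from nothing listening on localhost:9000, so every request dies with `[Errno 111] Connection refused`. One common way to keep integration tests from producing walls of tracebacks like this is to probe the server once and skip when it is down. A minimal sketch (the helper name and the conftest usage are hypothetical, not part of pyclowder):

```python
import socket
from contextlib import closing

def server_reachable(host="localhost", port=9000, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as sock:
        sock.settimeout(timeout)
        # connect_ex returns 0 on success, an errno (e.g. 111) on failure
        return sock.connect_ex((host, port)) == 0

# Hypothetical use in conftest.py:
# import pytest
# pytestmark = pytest.mark.skipif(
#     not server_reachable(),
#     reason="Clowder is not running on localhost:9000")
```

With a guard like this the 19 connection-refused failures would report as skips instead.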
26-Jul-2017 10:51:53 | _______________________________ test_get_sensors _______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fecdd810> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_get_sensors(caplog, host, key): |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = client.sensors_get() |
26-Jul-2017 10:51:53 | > sensors = response.json() |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_geostreams.py:19: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 27 DEBUG Getting all sensors |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | sensors.py 31 ERROR Error retrieving sensor list: not enough arguments for format string |
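The secondary error here, `not enough arguments for format string`, is the classic symptom of a `%`-style format string being interpolated with fewer arguments than it has placeholders, which then masks the real (connection) exception inside the `except` block. A sketch of the failure mode and the safer logging idiom (the message text mirrors sensors.py line 31, but the exact buggy call is a reconstruction):

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger(__name__)

exc = ConnectionError("[Errno 111] Connection refused")

# Buggy pattern (hypothetical reconstruction): a %s placeholder with no
# argument raises TypeError: not enough arguments for format string,
# replacing the original exception in the log.
try:
    "Error retrieving sensor list: %s" % ()
except TypeError as e:
    assert "not enough arguments" in str(e)

# Safer: pass the argument to the logger and let it interpolate lazily.
log.error("Error retrieving sensor list: %s", exc)
```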
26-Jul-2017 10:51:53 | ____________________________ test_raise_for_status _____________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fe9f51d0> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_raise_for_status(caplog, host, key): |
26-Jul-2017 10:51:53 | client = ClowderClient(host=host, key=key) |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | > client.get_json("this_path_does_not_exist") |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_geostreams.py:27: |
26-Jul-2017 10:51:53 | _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ |
26-Jul-2017 10:51:53 | pyclowder/client.py:68: in get_json |
26-Jul-2017 10:51:53 | r = requests.get(url, headers=self.headers) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get |
26-Jul-2017 10:51:53 | return request('get', url, params=params, **kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request |
26-Jul-2017 10:51:53 | return session.request(method=method, url=url, **kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request |
26-Jul-2017 10:51:53 | resp = self.send(prep, **send_kwargs) |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send |
26-Jul-2017 10:51:53 | r = adapter.send(request, **kwargs) |
26-Jul-2017 10:51:53 | _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | self = <requests.adapters.HTTPAdapter object at 0x7fb5fe9f55d0> |
26-Jul-2017 10:51:53 | request = <PreparedRequest [GET]>, stream = False |
26-Jul-2017 10:51:53 | timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7fb5fe9f5c10> |
26-Jul-2017 10:51:53 | verify = True, cert = None, proxies = OrderedDict() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None): |
26-Jul-2017 10:51:53 | """Sends PreparedRequest object. Returns Response object. |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. |
26-Jul-2017 10:51:53 | :param stream: (optional) Whether to stream the request content. |
26-Jul-2017 10:51:53 | :param timeout: (optional) How long to wait for the server to send |
26-Jul-2017 10:51:53 | data before giving up, as a float, or a :ref:`(connect timeout, |
26-Jul-2017 10:51:53 | read timeout) <timeouts>` tuple. |
26-Jul-2017 10:51:53 | :type timeout: float or tuple |
26-Jul-2017 10:51:53 | :param verify: (optional) Whether to verify SSL certificates. |
26-Jul-2017 10:51:53 | :param cert: (optional) Any user-provided SSL certificate to be trusted. |
26-Jul-2017 10:51:53 | :param proxies: (optional) The proxies dictionary to apply to the request. |
26-Jul-2017 10:51:53 | """ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | conn = self.get_connection(request.url, proxies) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | self.cert_verify(conn, request.url, verify, cert) |
26-Jul-2017 10:51:53 | url = self.request_url(request, proxies) |
26-Jul-2017 10:51:53 | self.add_headers(request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | chunked = not (request.body is None or 'Content-Length' in request.headers) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(timeout, tuple): |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | connect, read = timeout |
26-Jul-2017 10:51:53 | timeout = TimeoutSauce(connect=connect, read=read) |
26-Jul-2017 10:51:53 | except ValueError as e: |
26-Jul-2017 10:51:53 | # this may raise a string formatting error. |
26-Jul-2017 10:51:53 | err = ("Invalid timeout {0}. Pass a (connect, read) " |
26-Jul-2017 10:51:53 | "timeout tuple, or a single float to set " |
26-Jul-2017 10:51:53 | "both timeouts to the same value".format(timeout)) |
26-Jul-2017 10:51:53 | raise ValueError(err) |
26-Jul-2017 10:51:53 | else: |
26-Jul-2017 10:51:53 | timeout = TimeoutSauce(connect=timeout, read=timeout) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | if not chunked: |
26-Jul-2017 10:51:53 | resp = conn.urlopen( |
26-Jul-2017 10:51:53 | method=request.method, |
26-Jul-2017 10:51:53 | url=url, |
26-Jul-2017 10:51:53 | body=request.body, |
26-Jul-2017 10:51:53 | headers=request.headers, |
26-Jul-2017 10:51:53 | redirect=False, |
26-Jul-2017 10:51:53 | assert_same_host=False, |
26-Jul-2017 10:51:53 | preload_content=False, |
26-Jul-2017 10:51:53 | decode_content=False, |
26-Jul-2017 10:51:53 | retries=self.max_retries, |
26-Jul-2017 10:51:53 | timeout=timeout |
26-Jul-2017 10:51:53 | ) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | # Send the request. |
26-Jul-2017 10:51:53 | else: |
26-Jul-2017 10:51:53 | if hasattr(conn, 'proxy_pool'): |
26-Jul-2017 10:51:53 | conn = conn.proxy_pool |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | low_conn.putrequest(request.method, |
26-Jul-2017 10:51:53 | url, |
26-Jul-2017 10:51:53 | skip_accept_encoding=True) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | for header, value in request.headers.items(): |
26-Jul-2017 10:51:53 | low_conn.putheader(header, value) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | low_conn.endheaders() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | for i in request.body: |
26-Jul-2017 10:51:53 | low_conn.send(hex(len(i))[2:].encode('utf-8')) |
26-Jul-2017 10:51:53 | low_conn.send(b'\r\n') |
26-Jul-2017 10:51:53 | low_conn.send(i) |
26-Jul-2017 10:51:53 | low_conn.send(b'\r\n') |
26-Jul-2017 10:51:53 | low_conn.send(b'0\r\n\r\n') |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | # Receive the response from the server |
26-Jul-2017 10:51:53 | try: |
26-Jul-2017 10:51:53 | # For Python 2.7+ versions, use buffering of HTTP |
26-Jul-2017 10:51:53 | # responses |
26-Jul-2017 10:51:53 | r = low_conn.getresponse(buffering=True) |
26-Jul-2017 10:51:53 | except TypeError: |
26-Jul-2017 10:51:53 | # For compatibility with Python 2.6 versions and back |
26-Jul-2017 10:51:53 | r = low_conn.getresponse() |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | resp = HTTPResponse.from_httplib( |
26-Jul-2017 10:51:53 | r, |
26-Jul-2017 10:51:53 | pool=conn, |
26-Jul-2017 10:51:53 | connection=low_conn, |
26-Jul-2017 10:51:53 | preload_content=False, |
26-Jul-2017 10:51:53 | decode_content=False |
26-Jul-2017 10:51:53 | ) |
26-Jul-2017 10:51:53 | except: |
26-Jul-2017 10:51:53 | # If we hit any problems here, clean up the connection. |
26-Jul-2017 10:51:53 | # Then, reraise so that we can handle the actual exception. |
26-Jul-2017 10:51:53 | low_conn.close() |
26-Jul-2017 10:51:53 | raise |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | except (ProtocolError, socket.error) as err: |
26-Jul-2017 10:51:53 | raise ConnectionError(err, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | except MaxRetryError as e: |
26-Jul-2017 10:51:53 | if isinstance(e.reason, ConnectTimeoutError): |
26-Jul-2017 10:51:53 | # TODO: Remove this in 3.0.0: see #2811 |
26-Jul-2017 10:51:53 | if not isinstance(e.reason, NewConnectionError): |
26-Jul-2017 10:51:53 | raise ConnectTimeout(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(e.reason, ResponseError): |
26-Jul-2017 10:51:53 | raise RetryError(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | if isinstance(e.reason, _ProxyError): |
26-Jul-2017 10:51:53 | raise ProxyError(e, request=request) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | > raise ConnectionError(e, request=request) |
26-Jul-2017 10:51:53 | E ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/apithis_path_does_not_exist (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fe9f5d10>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | ______________________________ test_sensors_post _______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fea1cf50> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_sensors_post(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | sensor_json = client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER") |
26-Jul-2017 10:51:53 | response = client.sensor_post(sensor_json) |
26-Jul-2017 10:51:53 | > body = response.json() |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_sensors.py:13: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 67 DEBUG Adding sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | client.py 147 ERROR POST http://localhost:9000/clowder/api/geostreams/sensors: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fe967610>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
26-Jul-2017 10:51:53 | _______________________________ test_sensors_get _______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fe967650> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_sensors_get(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = client.sensor_get(sensor_id) |
26-Jul-2017 10:51:53 | > sensor = response.json() |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_sensors.py:24: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 40 DEBUG Getting sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | sensors.py 44 ERROR Error retrieving sensor : not enough arguments for format string |
26-Jul-2017 10:51:53 | _____________________________ test_sensors_delete ______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fea1c650> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_sensors_delete(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = client.sensor_delete(sensor_id) |
26-Jul-2017 10:51:53 | > sensor = response.json() |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_sensors.py:34: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 101 DEBUG Deleting sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | client.py 211 ERROR DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fe9e3890>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
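Every `AttributeError: 'NoneType' object has no attribute 'json'` above follows the same shape: the client method logs the HTTP failure and returns None, and the test then calls `.json()` on it. A guard at the call site turns that into an actionable message. A sketch (the helper and the `FakeResponse` stand-in are hypothetical; only the None-on-failure behaviour comes from the log):

```python
def require_response(response, context):
    """Fail fast with a clear message instead of AttributeError on None.

    The client methods in this build return None when the HTTP request
    fails, so callers later crash with
    "'NoneType' object has no attribute 'json'".
    """
    if response is None:
        raise RuntimeError("%s returned no response -- is the server up?" % context)
    return response

class FakeResponse:  # stand-in for requests.Response in this sketch
    def json(self):
        return {"id": "abc"}

# Normal path: the response passes through untouched.
body = require_response(FakeResponse(), "sensor_post").json()
assert body["id"] == "abc"

# Failure path: a named, searchable error instead of an AttributeError.
try:
    require_response(None, "sensor_get")
except RuntimeError as e:
    assert "sensor_get" in str(e)
```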
26-Jul-2017 10:51:53 | __________________________________ PEP8-check __________________________________ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/tests/test_streams.py:8:1: E302 expected 2 blank lines, found 1 |
26-Jul-2017 10:51:53 | def test_streams_post(caplog, host, key): |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/tests/test_streams.py:20:24: W291 trailing whitespace |
26-Jul-2017 10:51:53 | assert "id" in body |
26-Jul-2017 10:51:53 | ^ |
26-Jul-2017 10:51:53 | /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1/tests/test_streams.py:39:54: W291 trailing whitespace |
26-Jul-2017 10:51:53 | response = stream_client.stream_delete(stream_id) |
26-Jul-2017 10:51:53 | ^ |
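The three PEP8 findings in tests/test_streams.py are mechanical: one E302 (add a second blank line before the function) and two W291 (trailing whitespace). The W291 class of fix is trivially scriptable; a sketch of the whitespace cleanup (illustration only, not how the build actually fixes it):

```python
import re

def strip_trailing_whitespace(text):
    """Remove trailing spaces/tabs (PEP8 W291) from every line."""
    return "\n".join(re.sub(r"[ \t]+$", "", line) for line in text.split("\n"))

# The flagged line from tests/test_streams.py:20 with its trailing spaces:
src = 'assert "id" in body   \n'
assert strip_trailing_whitespace(src) == 'assert "id" in body\n'
```

In practice a formatter such as autopep8 (if installed) can clear both codes in place, e.g. `autopep8 --in-place --select=E302,W291 tests/test_streams.py`.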
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | ______________________________ test_streams_post _______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fe9e31d0> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_streams_post(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id, stream_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | sensor_client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | sensor_json = sensor_client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER") |
26-Jul-2017 10:51:53 | sensor_body = sensor_client.sensor_post_json(sensor_json) |
26-Jul-2017 10:51:53 | > sensor_id = sensor_body['id'] |
26-Jul-2017 10:51:53 | E TypeError: 'NoneType' object has no attribute '__getitem__' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_streams.py:14: TypeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 80 DEBUG Adding or getting sensor |
26-Jul-2017 10:51:53 | sensors.py 53 DEBUG Getting sensor Test Sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | sensors.py 57 ERROR Error retrieving sensor Test Sensor: not enough arguments for format string |
26-Jul-2017 10:51:53 | sensors.py 92 ERROR Error adding sensor Test Sensor: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | _______________________________ test_streams_get _______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fea200d0> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_streams_get(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id, stream_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | stream_client = StreamsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | > stream = stream_client.stream_get_by_name("Test Sensor") |
26-Jul-2017 10:51:53 | E AttributeError: 'StreamsApi' object has no attribute 'stream_get_by_name' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_streams.py:27: AttributeError |
26-Jul-2017 10:51:53 | _____________________________ test_streams_delete ______________________________ |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7fb5fea0a3d0> |
26-Jul-2017 10:51:53 | host = 'http://localhost:9000/clowder', key = 'r1ek3rs' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | def test_streams_delete(caplog, host, key): |
26-Jul-2017 10:51:53 | global sensor_id, stream_id |
26-Jul-2017 10:51:53 | caplog.setLevel(logging.DEBUG) |
26-Jul-2017 10:51:53 | sensor_client = SensorsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = sensor_client.sensor_delete(sensor_id) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | stream_client = StreamsApi(host=host, key=key) |
26-Jul-2017 10:51:53 | response = stream_client.stream_delete(stream_id) |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | > stream = response.json() |
26-Jul-2017 10:51:53 | E AttributeError: 'NoneType' object has no attribute 'json' |
26-Jul-2017 10:51:53 | |
26-Jul-2017 10:51:53 | tests/test_streams.py:41: AttributeError |
26-Jul-2017 10:51:53 | --------------------------------- Captured log --------------------------------- |
26-Jul-2017 10:51:53 | sensors.py 101 DEBUG Deleting sensor |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | client.py 211 ERROR DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fe9e3ed0>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
26-Jul-2017 10:51:53 | streams.py 90 DEBUG Deleting stream |
26-Jul-2017 10:51:53 | connectionpool.py 213 INFO Starting new HTTP connection (1): localhost |
26-Jul-2017 10:51:53 | client.py 211 ERROR DELETE http://localhost:9000/clowder/api/geostreams/streams/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/streams/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fb5fe967750>: Failed to establish a new connection: [Errno 111] Connection refused',)) |
26-Jul-2017 10:51:53 | ============================ pytest-warning summary ============================ |
26-Jul-2017 10:51:53 | WI1 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/pytest_capturelog.py:171 'pytest_runtest_makereport' hook uses deprecated __multicall__ argument |
26-Jul-2017 10:51:53 | WC1 None pytest_funcarg__caplog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0. Please remove the prefix and use the @pytest.fixture decorator instead. |
26-Jul-2017 10:51:53 | WC1 None pytest_funcarg__capturelog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0. Please remove the prefix and use the @pytest.fixture decorator instead. |
26-Jul-2017 10:51:53 | =========== 19 failed, 20 passed, 3 pytest-warnings in 0.60 seconds ============ |
26-Jul-2017 10:51:53 | Failing task since return code of [/home/bamboo/bamboo-agent-home/temp/CATS-PYC245-JOB1-7-ScriptBuildTask-7594916351186564715.sh] was 1 while expected 0 |
26-Jul-2017 10:51:53 | Finished task 'pytest' with result: Failed |
26-Jul-2017 10:51:53 | Starting task 'test results' of type 'com.atlassian.bamboo.plugins.testresultparser:task.testresultparser.junit' |
26-Jul-2017 10:51:53 | Parsing test results under /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC245-JOB1... |
26-Jul-2017 10:51:53 | Failing task since 19 failing test cases were found. |
26-Jul-2017 10:51:53 | Finished task 'test results' with result: Failed |
26-Jul-2017 10:51:53 | Running post build plugin 'Docker Container Cleanup' |
26-Jul-2017 10:51:53 | Running post build plugin 'NCover Results Collector' |
26-Jul-2017 10:51:53 | Running post build plugin 'Clover Results Collector' |
26-Jul-2017 10:51:53 | Running post build plugin 'npm Cache Cleanup' |
26-Jul-2017 10:51:53 | Running post build plugin 'Artifact Copier' |
26-Jul-2017 10:51:53 | Finalising the build... |
26-Jul-2017 10:51:53 | Stopping timer. |
26-Jul-2017 10:51:53 | Build CATS-PYC245-JOB1-7 completed. |
26-Jul-2017 10:51:53 | Running on server: post build plugin 'NCover Results Collector' |
26-Jul-2017 10:51:53 | Running on server: post build plugin 'Build Hanging Detection Configuration' |
26-Jul-2017 10:51:53 | Running on server: post build plugin 'Clover Delta Calculator' |
26-Jul-2017 10:51:53 | Running on server: post build plugin 'Maven Dependencies Postprocessor' |
26-Jul-2017 10:51:53 | All post build plugins have finished |
26-Jul-2017 10:51:53 | Generating build results summary... |
26-Jul-2017 10:51:53 | Saving build results to disk... |
26-Jul-2017 10:51:53 | Logging substituted variables... |
26-Jul-2017 10:51:53 | Indexing build results... |
26-Jul-2017 10:51:53 | Finished building CATS-PYC245-JOB1-7. |