simple 11-Aug-2017 16:09:38 Build Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27) started building on agent buildserver-3.os.ncsa.edu
simple 11-Aug-2017 16:09:38 Remote agent on host buildserver-3.os.ncsa.edu
simple 11-Aug-2017 16:09:38 Build working directory is /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
simple 11-Aug-2017 16:09:38 Executing build Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27)
simple 11-Aug-2017 16:09:38 Starting task 'Checkout Default Repository' of type 'com.atlassian.bamboo.plugins.vcs:task.vcs.checkout'
simple 11-Aug-2017 16:09:38 Updating source code to revision: e2026b1974297b550adbbb55bb3f5f6119c91bab
simple 11-Aug-2017 16:09:38 Creating local git repository in '/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/.git'.
simple 11-Aug-2017 16:09:38 Initialized empty Git repository in /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/.git/
simple 11-Aug-2017 16:09:38 Fetching 'refs/heads/geostreams' from 'ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git'. Will try to do a shallow fetch.
simple 11-Aug-2017 16:09:38 Warning: Permanently added '[127.0.0.1]:37712' (RSA) to the list of known hosts.
simple 11-Aug-2017 16:09:38 From ssh://127.0.0.1:37712/cats/pyclowder2
simple 11-Aug-2017 16:09:38  * [new branch] geostreams -> geostreams
simple 11-Aug-2017 16:09:38 Checking out revision e2026b1974297b550adbbb55bb3f5f6119c91bab.
simple 11-Aug-2017 16:09:38 Switched to branch 'geostreams'
simple 11-Aug-2017 16:09:38 Updated source code to revision: e2026b1974297b550adbbb55bb3f5f6119c91bab
simple 11-Aug-2017 16:09:38 Finished task 'Checkout Default Repository' with result: Success
simple 11-Aug-2017 16:09:38 Running pre-build action: VCS Version Collector
simple 11-Aug-2017 16:09:38 Starting task 'pytest' of type 'com.atlassian.bamboo.plugins.scripttask:task.builder.script'
command 11-Aug-2017 16:09:38 Beginning to execute external process for build 'Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27)'
 ... running command line:
/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-27-ScriptBuildTask-5556430487952047889.sh
 ... in: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
 ... using extra environment variables:
bamboo_planRepository_1_branch=geostreams
bamboo_capability_system_builder_node_Node_js_v6_9_1=/home/bamboo/node-v6.9.1/bin/node
bamboo_capability_system_builder_command_npm_6=/home/bamboo/node-v6.9.1/bin/npm
bamboo_capability_system_builder_command_buckminster_4_3=/home/bamboo/buckminster-4.3/buckminster
bamboo_capability_system_builder_command_buckminster_4_2=/home/bamboo/buckminster-4.2/buckminster
bamboo_planRepository_1_branchDisplayName=geostreams
bamboo_repository_revision_number=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_resultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-27
bamboo_repository_127172662_previous_revision_number=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_capability_system_builder_command_sphinx=/usr/bin/sphinx-build
bamboo_planRepository_1_name=pyclowder2
bamboo_repository_127172662_branch_name=geostreams
bamboo_build_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_buildKey=CATS-PYC26-JOB1
bamboo_capability_system_os=linux
bamboo_repository_127172662_git_branch=geostreams
bamboo_shortPlanName=geostreams
bamboo_repository_127172662_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_repository_127172662_revision_number=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_planRepository_name=pyclowder2
bamboo_buildNumber=27
bamboo_repository_127172662_name=pyclowder2
bamboo_shortJobName=Default Job
bamboo_buildResultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-27
bamboo_planRepository_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_agentId=143032321
bamboo_planName=Clowder - pyclowder2 - geostreams
bamboo_shortPlanKey=PYC26
bamboo_capability_system_builder_command_sbt_0_13=/home/bamboo/sbt-0.13.2/bin/sbt
bamboo_shortJobKey=JOB1
bamboo_capability_system_builder_node_Node_js_v0_10_28=/home/bamboo/node-v0.10.28/bin/node
bamboo_planRepository_revision=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_repository_previous_revision_number=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_buildTimeStamp=2017-08-11T16:09:37.731-05:00
bamboo_capability_system_builder_command_npm=/home/bamboo/node-v0.10.28/bin/npm
bamboo_planRepository_previousRevision=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_capability_system_builder_mvn2_Maven_2=/home/bamboo/apache-maven-2.2.1
bamboo_buildResultKey=CATS-PYC26-JOB1-27
bamboo_repository_git_branch=geostreams
bamboo_repository_branch_name=geostreams
bamboo_buildPlanName=Clowder - pyclowder2 - geostreams - Default Job
bamboo_planRepository_1_revision=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_capability_system_builder_command_python3=/usr/bin/python3
bamboo_repository_name=pyclowder2
bamboo_repository_127172662_git_username=
bamboo_buildFailed=false
bamboo_capability_system_docker_executable=/usr/bin/docker
bamboo_capability_system_builder_command_grunt=/home/bamboo/node-v0.10.28/bin/grunt
bamboo_planRepository_branch=geostreams
bamboo_agentWorkingDirectory=/home/bamboo/bamboo-agent-home/xml-data/build-dir
bamboo_capability_system_git_executable=/usr/bin/git
bamboo_planRepository_1_previousRevision=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_repository_git_username=
bamboo_capability_system_builder_sbt_SBT_0_13_13=/home/bamboo/sbt-0.13.13
bamboo_planRepository_branchDisplayName=geostreams
bamboo_capability_system_builder_command_phantomjs=/home/bamboo/phantomjs-1.9.8/bin/phantomjs
bamboo_planRepository_1_type=bbserver
bamboo_planRepository_branchName=geostreams
bamboo_capability_system_builder_command_python2_7=/usr/bin/python2.7
bamboo_capability_system_hostname=buildserver-1
bamboo_capability_system_jdk_JDK=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_software_mongo=/usr/bin/mongo
bamboo_plan_storageTag=plan-126976103
bamboo_capability_system_software_rabbitmq=/usr/sbin/rabbitmqctl
bamboo_capability_system_builder_command_casperjs=/home/bamboo/node-v0.10.28/bin/casperjs
bamboo_planRepository_type=bbserver
bamboo_planRepository_1_username=
bamboo_capability_system_jdk_JDK_1_8_0_66=/home/bamboo/jdk1.8.0_66
bamboo_repository_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_capability_system_builder_node_Node_js=/home/bamboo/node-v0.10.28/bin/node
bamboo_capability_system_builder_ant_Ant=/home/bamboo/apache-ant-1.9.4
bamboo_capability_system_builder_mvn3_Maven_3=/home/bamboo/apache-maven-3.3.9
bamboo_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_planKey=CATS-PYC26
bamboo_planRepository_1_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_planRepository_username=
bamboo_capability_system_jdk_JDK_1_8=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_jdk_JDK_1_6=/home/bamboo/jdk1.6.0_41
bamboo_capability_system_builder_command_mkcrx=/home/bamboo/mkcrx/mkcrx.sh
bamboo_capability_system_jdk_JDK_1_7=/home/bamboo/jdk1.7.0_60
bamboo_planRepository_1_branchName=geostreams
build 11-Aug-2017 16:09:39 New python executable in /tmp/virtualenv/pyclowder2/bin/python2
build 11-Aug-2017 16:09:39 Also creating executable in /tmp/virtualenv/pyclowder2/bin/python
build 11-Aug-2017 16:09:42 Installing setuptools, pkg_resources, pip, wheel...done.
build 11-Aug-2017 16:09:42 Running virtualenv with interpreter /usr/bin/python2
build 11-Aug-2017 16:09:42 Collecting enum34==1.1.6 (from -r requirements.txt (line 1))
build 11-Aug-2017 16:09:42 Using cached enum34-1.1.6-py2-none-any.whl
build 11-Aug-2017 16:09:42 Collecting et-xmlfile==1.0.1 (from -r requirements.txt (line 2))
build 11-Aug-2017 16:09:42 Collecting jdcal==1.3 (from -r requirements.txt (line 3))
build 11-Aug-2017 16:09:43 Collecting openpyxl==2.4.1 (from -r requirements.txt (line 4))
build 11-Aug-2017 16:09:43 Collecting pika==0.10.0 (from -r requirements.txt (line 5))
build 11-Aug-2017 16:09:43 Using cached pika-0.10.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:43 Collecting python-dateutil==2.6.0 (from -r requirements.txt (line 6))
build 11-Aug-2017 16:09:43 Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:43 Collecting pytz==2016.10 (from -r requirements.txt (line 7))
build 11-Aug-2017 16:09:43 Using cached pytz-2016.10-py2.py3-none-any.whl
build 11-Aug-2017 16:09:43 Collecting PyYAML==3.11 (from -r requirements.txt (line 8))
build 11-Aug-2017 16:09:43 Collecting requests==2.10.0 (from -r requirements.txt (line 9))
build 11-Aug-2017 16:09:43 Using cached requests-2.10.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:43 Collecting six==1.10.0 (from -r requirements.txt (line 10))
build 11-Aug-2017 16:09:43 Using cached six-1.10.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:43 Collecting wheel==0.24.0 (from -r requirements.txt (line 11))
build 11-Aug-2017 16:09:44 Using cached wheel-0.24.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting pytest==3.0.3 (from -r requirements.txt (line 12))
build 11-Aug-2017 16:09:44 Using cached pytest-3.0.3-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting pytest-pep8==1.0.6 (from -r requirements.txt (line 13))
build 11-Aug-2017 16:09:44 Collecting pytest-capturelog==0.7 (from -r requirements.txt (line 14))
build 11-Aug-2017 16:09:44 Collecting urllib3==1.14 (from -r requirements.txt (line 15))
build 11-Aug-2017 16:09:44 Using cached urllib3-1.14-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting py>=1.4.29 (from pytest==3.0.3->-r requirements.txt (line 12))
build 11-Aug-2017 16:09:44 Using cached py-1.4.34-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting pep8>=1.3 (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
build 11-Aug-2017 16:09:44 Using cached pep8-1.7.0-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting pytest-cache (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
build 11-Aug-2017 16:09:44 Collecting execnet>=1.1.dev1 (from pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
build 11-Aug-2017 16:09:44 Using cached execnet-1.4.1-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Collecting apipkg>=1.4 (from execnet>=1.1.dev1->pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
build 11-Aug-2017 16:09:44 Using cached apipkg-1.4-py2.py3-none-any.whl
build 11-Aug-2017 16:09:44 Installing collected packages: enum34, et-xmlfile, jdcal, openpyxl, pika, six, python-dateutil, pytz, PyYAML, requests, wheel, py, pytest, pep8, apipkg, execnet, pytest-cache, pytest-pep8, pytest-capturelog, urllib3
build 11-Aug-2017 16:09:45 Found existing installation: wheel 0.30.0a0
build 11-Aug-2017 16:09:45 Uninstalling wheel-0.30.0a0:
build 11-Aug-2017 16:09:45 Successfully uninstalled wheel-0.30.0a0
build 11-Aug-2017 16:09:45 Successfully installed PyYAML-3.11 apipkg-1.4 enum34-1.1.6 et-xmlfile-1.0.1 execnet-1.4.1 jdcal-1.3 openpyxl-2.4.1 pep8-1.7.0 pika-0.10.0 py-1.4.34 pytest-3.0.3 pytest-cache-1.0 pytest-capturelog-0.7 pytest-pep8-1.0.6 python-dateutil-2.6.0 pytz-2016.10 requests-2.10.0 six-1.10.0 urllib3-1.14 wheel-0.24.0
build 11-Aug-2017 16:09:46 ============================= test session starts ==============================
build 11-Aug-2017 16:09:46 platform linux2 -- Python 2.7.12, pytest-3.0.3, py-1.4.34, pluggy-0.4.0
build 11-Aug-2017 16:09:46 rootdir: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1, inifile: setup.cfg
build 11-Aug-2017 16:09:46 plugins: pep8-1.0.6, capturelog-0.7
build 11-Aug-2017 16:09:46 collected 39 items
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 setup.py .
build 11-Aug-2017 16:09:46 docs/source/conf.py .
build 11-Aug-2017 16:09:46 pyclowder/__init__.py .
build 11-Aug-2017 16:09:46 pyclowder/client.py .
build 11-Aug-2017 16:09:46 pyclowder/collections.py .
build 11-Aug-2017 16:09:46 pyclowder/connectors.py .
build 11-Aug-2017 16:09:46 pyclowder/datasets.py .
build 11-Aug-2017 16:09:46 pyclowder/extractors.py .
build 11-Aug-2017 16:09:46 pyclowder/files.py .
build 11-Aug-2017 16:09:46 pyclowder/sections.py .
build 11-Aug-2017 16:09:46 pyclowder/utils.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/__init__.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/cache.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/csv.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/datapoints.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/datasets.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/geocode_convert.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/map_names.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/sensors.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/streams.py .
build 11-Aug-2017 16:09:46 pyclowder/geostreams/time_transformers.py .
build 11-Aug-2017 16:09:46 sample-extractors/echo/echo.py .
build 11-Aug-2017 16:09:46 sample-extractors/wordcount/wordcount.py .
build 11-Aug-2017 16:09:46 tests/__init__.py .
build 11-Aug-2017 16:09:46 tests/conftest.py .
build 11-Aug-2017 16:09:46 tests/test_datapoints.py .F
build 11-Aug-2017 16:09:46 tests/test_geostreams.py .FFF
build 11-Aug-2017 16:09:46 tests/test_sensors.py .FFF
build 11-Aug-2017 16:09:46 tests/test_streams.py .FFF
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 generated xml file: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/test-reports/results.xml
build 11-Aug-2017 16:09:46 =================================== FAILURES ===================================
build 11-Aug-2017 16:09:46 _____________________ test_datapoints_count_by_sensor_get ______________________
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def test_datapoints_count_by_sensor_get(caplog, host, key):
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = DatapointsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = client.datapoints_count_by_sensor_get(950)
build 11-Aug-2017 16:09:46 >       sensors = response.text
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'text'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 tests/test_datapoints.py:10: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 datapoints.py 45 DEBUG Counting datapoints by sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 datapoints.py 49 ERROR Error counting datapoints by sensor 950: not enough arguments for format string
build 11-Aug-2017 16:09:46 _________________________________ test_version _________________________________
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def test_version(caplog, host, key):
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = ClowderClient(host=host, key=key)
build 11-Aug-2017 16:09:46 >       version = client.version()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 tests/test_geostreams.py:12:
build 11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
build 11-Aug-2017 16:09:46 pyclowder/client.py:47: in version
build 11-Aug-2017 16:09:46     r = requests.get(url)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
build 11-Aug-2017 16:09:46     return request('get', url, params=params, **kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
build 11-Aug-2017 16:09:46     return session.request(method=method, url=url, **kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
build 11-Aug-2017 16:09:46     resp = self.send(prep, **send_kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
build 11-Aug-2017 16:09:46     r = adapter.send(request, **kwargs)
build 11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 self =
build 11-Aug-2017 16:09:46 request = , stream = False
build 11-Aug-2017 16:09:46 timeout =
build 11-Aug-2017 16:09:46 verify = True, cert = None, proxies = OrderedDict()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
build 11-Aug-2017 16:09:46         """Sends PreparedRequest object. Returns Response object.
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         :param request: The :class:`PreparedRequest ` being sent.
build 11-Aug-2017 16:09:46         :param stream: (optional) Whether to stream the request content.
build 11-Aug-2017 16:09:46         :param timeout: (optional) How long to wait for the server to send
build 11-Aug-2017 16:09:46             data before giving up, as a float, or a :ref:`(connect timeout,
build 11-Aug-2017 16:09:46             read timeout) ` tuple.
build 11-Aug-2017 16:09:46         :type timeout: float or tuple
build 11-Aug-2017 16:09:46         :param verify: (optional) Whether to verify SSL certificates.
build 11-Aug-2017 16:09:46         :param cert: (optional) Any user-provided SSL certificate to be trusted.
build 11-Aug-2017 16:09:46         :param proxies: (optional) The proxies dictionary to apply to the request.
build 11-Aug-2017 16:09:46         """
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         conn = self.get_connection(request.url, proxies)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         self.cert_verify(conn, request.url, verify, cert)
build 11-Aug-2017 16:09:46         url = self.request_url(request, proxies)
build 11-Aug-2017 16:09:46         self.add_headers(request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         chunked = not (request.body is None or 'Content-Length' in request.headers)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         if isinstance(timeout, tuple):
build 11-Aug-2017 16:09:46             try:
build 11-Aug-2017 16:09:46                 connect, read = timeout
build 11-Aug-2017 16:09:46                 timeout = TimeoutSauce(connect=connect, read=read)
build 11-Aug-2017 16:09:46             except ValueError as e:
build 11-Aug-2017 16:09:46                 # this may raise a string formatting error.
build 11-Aug-2017 16:09:46                 err = ("Invalid timeout {0}. Pass a (connect, read) "
build 11-Aug-2017 16:09:46                        "timeout tuple, or a single float to set "
build 11-Aug-2017 16:09:46                        "both timeouts to the same value".format(timeout))
build 11-Aug-2017 16:09:46                 raise ValueError(err)
build 11-Aug-2017 16:09:46         else:
build 11-Aug-2017 16:09:46             timeout = TimeoutSauce(connect=timeout, read=timeout)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         try:
build 11-Aug-2017 16:09:46             if not chunked:
build 11-Aug-2017 16:09:46                 resp = conn.urlopen(
build 11-Aug-2017 16:09:46                     method=request.method,
build 11-Aug-2017 16:09:46                     url=url,
build 11-Aug-2017 16:09:46                     body=request.body,
build 11-Aug-2017 16:09:46                     headers=request.headers,
build 11-Aug-2017 16:09:46                     redirect=False,
build 11-Aug-2017 16:09:46                     assert_same_host=False,
build 11-Aug-2017 16:09:46                     preload_content=False,
build 11-Aug-2017 16:09:46                     decode_content=False,
build 11-Aug-2017 16:09:46                     retries=self.max_retries,
build 11-Aug-2017 16:09:46                     timeout=timeout
build 11-Aug-2017 16:09:46                 )
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             # Send the request.
build 11-Aug-2017 16:09:46             else:
build 11-Aug-2017 16:09:46                 if hasattr(conn, 'proxy_pool'):
build 11-Aug-2017 16:09:46                     conn = conn.proxy_pool
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                 try:
build 11-Aug-2017 16:09:46                     low_conn.putrequest(request.method,
build 11-Aug-2017 16:09:46                                         url,
build 11-Aug-2017 16:09:46                                         skip_accept_encoding=True)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     for header, value in request.headers.items():
build 11-Aug-2017 16:09:46                         low_conn.putheader(header, value)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     low_conn.endheaders()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     for i in request.body:
build 11-Aug-2017 16:09:46                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
build 11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
build 11-Aug-2017 16:09:46                         low_conn.send(i)
build 11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
build 11-Aug-2017 16:09:46                     low_conn.send(b'0\r\n\r\n')
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     # Receive the response from the server
build 11-Aug-2017 16:09:46                     try:
build 11-Aug-2017 16:09:46                         # For Python 2.7+ versions, use buffering of HTTP
build 11-Aug-2017 16:09:46                         # responses
build 11-Aug-2017 16:09:46                         r = low_conn.getresponse(buffering=True)
build 11-Aug-2017 16:09:46                     except TypeError:
build 11-Aug-2017 16:09:46                         # For compatibility with Python 2.6 versions and back
build 11-Aug-2017 16:09:46                         r = low_conn.getresponse()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     resp = HTTPResponse.from_httplib(
build 11-Aug-2017 16:09:46                         r,
build 11-Aug-2017 16:09:46                         pool=conn,
build 11-Aug-2017 16:09:46                         connection=low_conn,
build 11-Aug-2017 16:09:46                         preload_content=False,
build 11-Aug-2017 16:09:46                         decode_content=False
build 11-Aug-2017 16:09:46                     )
build 11-Aug-2017 16:09:46                 except:
build 11-Aug-2017 16:09:46                     # If we hit any problems here, clean up the connection.
build 11-Aug-2017 16:09:46                     # Then, reraise so that we can handle the actual exception.
build 11-Aug-2017 16:09:46                     low_conn.close()
build 11-Aug-2017 16:09:46                     raise
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         except (ProtocolError, socket.error) as err:
build 11-Aug-2017 16:09:46             raise ConnectionError(err, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         except MaxRetryError as e:
build 11-Aug-2017 16:09:46             if isinstance(e.reason, ConnectTimeoutError):
build 11-Aug-2017 16:09:46                 # TODO: Remove this in 3.0.0: see #2811
build 11-Aug-2017 16:09:46                 if not isinstance(e.reason, NewConnectionError):
build 11-Aug-2017 16:09:46                     raise ConnectTimeout(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             if isinstance(e.reason, ResponseError):
build 11-Aug-2017 16:09:46                 raise RetryError(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             if isinstance(e.reason, _ProxyError):
build 11-Aug-2017 16:09:46                 raise ProxyError(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 >           raise ConnectionError(e, request=request)
build 11-Aug-2017 16:09:46 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/version (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 client.py 46 DEBUG GET http://localhost:9000/clowder/api/version
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 _______________________________ test_get_sensors _______________________________
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def test_get_sensors(caplog, host, key):
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = client.sensors_get()
build 11-Aug-2017 16:09:46 >       sensors = response.json()
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 tests/test_geostreams.py:21: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 30 DEBUG Getting all sensors
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 sensors.py 34 ERROR Error retrieving sensor list: not enough arguments for format string
build 11-Aug-2017 16:09:46 ____________________________ test_raise_for_status _____________________________
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def test_raise_for_status(caplog, host, key):
build 11-Aug-2017 16:09:46         client = ClowderClient(host=host, key=key)
build 11-Aug-2017 16:09:46         try:
build 11-Aug-2017 16:09:46 >           client.get_json("this_path_does_not_exist")
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 tests/test_geostreams.py:29:
build 11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
build 11-Aug-2017 16:09:46 pyclowder/client.py:69: in get_json
build 11-Aug-2017 16:09:46     r = requests.get(url, headers=self.headers)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
build 11-Aug-2017 16:09:46     return request('get', url, params=params, **kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
build 11-Aug-2017 16:09:46     return session.request(method=method, url=url, **kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
build 11-Aug-2017 16:09:46     resp = self.send(prep, **send_kwargs)
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
build 11-Aug-2017 16:09:46     r = adapter.send(request, **kwargs)
build 11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 self =
build 11-Aug-2017 16:09:46 request = , stream = False
build 11-Aug-2017 16:09:46 timeout =
build 11-Aug-2017 16:09:46 verify = True, cert = None, proxies = OrderedDict()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
build 11-Aug-2017 16:09:46         """Sends PreparedRequest object. Returns Response object.
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         :param request: The :class:`PreparedRequest ` being sent.
build 11-Aug-2017 16:09:46         :param stream: (optional) Whether to stream the request content.
build 11-Aug-2017 16:09:46         :param timeout: (optional) How long to wait for the server to send
build 11-Aug-2017 16:09:46             data before giving up, as a float, or a :ref:`(connect timeout,
build 11-Aug-2017 16:09:46             read timeout) ` tuple.
build 11-Aug-2017 16:09:46         :type timeout: float or tuple
build 11-Aug-2017 16:09:46         :param verify: (optional) Whether to verify SSL certificates.
build 11-Aug-2017 16:09:46         :param cert: (optional) Any user-provided SSL certificate to be trusted.
build 11-Aug-2017 16:09:46         :param proxies: (optional) The proxies dictionary to apply to the request.
build 11-Aug-2017 16:09:46         """
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         conn = self.get_connection(request.url, proxies)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         self.cert_verify(conn, request.url, verify, cert)
build 11-Aug-2017 16:09:46         url = self.request_url(request, proxies)
build 11-Aug-2017 16:09:46         self.add_headers(request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         chunked = not (request.body is None or 'Content-Length' in request.headers)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         if isinstance(timeout, tuple):
build 11-Aug-2017 16:09:46             try:
build 11-Aug-2017 16:09:46                 connect, read = timeout
build 11-Aug-2017 16:09:46                 timeout = TimeoutSauce(connect=connect, read=read)
build 11-Aug-2017 16:09:46             except ValueError as e:
build 11-Aug-2017 16:09:46                 # this may raise a string formatting error.
build 11-Aug-2017 16:09:46                 err = ("Invalid timeout {0}. Pass a (connect, read) "
build 11-Aug-2017 16:09:46                        "timeout tuple, or a single float to set "
build 11-Aug-2017 16:09:46                        "both timeouts to the same value".format(timeout))
build 11-Aug-2017 16:09:46                 raise ValueError(err)
build 11-Aug-2017 16:09:46         else:
build 11-Aug-2017 16:09:46             timeout = TimeoutSauce(connect=timeout, read=timeout)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         try:
build 11-Aug-2017 16:09:46             if not chunked:
build 11-Aug-2017 16:09:46                 resp = conn.urlopen(
build 11-Aug-2017 16:09:46                     method=request.method,
build 11-Aug-2017 16:09:46                     url=url,
build 11-Aug-2017 16:09:46                     body=request.body,
build 11-Aug-2017 16:09:46                     headers=request.headers,
build 11-Aug-2017 16:09:46                     redirect=False,
build 11-Aug-2017 16:09:46                     assert_same_host=False,
build 11-Aug-2017 16:09:46                     preload_content=False,
build 11-Aug-2017 16:09:46                     decode_content=False,
build 11-Aug-2017 16:09:46                     retries=self.max_retries,
build 11-Aug-2017 16:09:46                     timeout=timeout
build 11-Aug-2017 16:09:46                 )
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             # Send the request.
build 11-Aug-2017 16:09:46             else:
build 11-Aug-2017 16:09:46                 if hasattr(conn, 'proxy_pool'):
build 11-Aug-2017 16:09:46                     conn = conn.proxy_pool
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                 try:
build 11-Aug-2017 16:09:46                     low_conn.putrequest(request.method,
build 11-Aug-2017 16:09:46                                         url,
build 11-Aug-2017 16:09:46                                         skip_accept_encoding=True)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     for header, value in request.headers.items():
build 11-Aug-2017 16:09:46                         low_conn.putheader(header, value)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     low_conn.endheaders()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     for i in request.body:
build 11-Aug-2017 16:09:46                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
build 11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
build 11-Aug-2017 16:09:46                         low_conn.send(i)
build 11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
build 11-Aug-2017 16:09:46                     low_conn.send(b'0\r\n\r\n')
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     # Receive the response from the server
build 11-Aug-2017 16:09:46                     try:
build 11-Aug-2017 16:09:46                         # For Python 2.7+ versions, use buffering of HTTP
build 11-Aug-2017 16:09:46                         # responses
build 11-Aug-2017 16:09:46                         r = low_conn.getresponse(buffering=True)
build 11-Aug-2017 16:09:46                     except TypeError:
build 11-Aug-2017 16:09:46                         # For compatibility with Python 2.6 versions and back
build 11-Aug-2017 16:09:46                         r = low_conn.getresponse()
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46                     resp = HTTPResponse.from_httplib(
build 11-Aug-2017 16:09:46                         r,
build 11-Aug-2017 16:09:46                         pool=conn,
build 11-Aug-2017 16:09:46                         connection=low_conn,
build 11-Aug-2017 16:09:46                         preload_content=False,
build 11-Aug-2017 16:09:46                         decode_content=False
build 11-Aug-2017 16:09:46                     )
build 11-Aug-2017 16:09:46                 except:
build 11-Aug-2017 16:09:46                     # If we hit any problems here, clean up the connection.
build 11-Aug-2017 16:09:46                     # Then, reraise so that we can handle the actual exception.
build 11-Aug-2017 16:09:46                     low_conn.close()
build 11-Aug-2017 16:09:46                     raise
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         except (ProtocolError, socket.error) as err:
build 11-Aug-2017 16:09:46             raise ConnectionError(err, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46         except MaxRetryError as e:
build 11-Aug-2017 16:09:46             if isinstance(e.reason, ConnectTimeoutError):
build 11-Aug-2017 16:09:46                 # TODO: Remove this in 3.0.0: see #2811
build 11-Aug-2017 16:09:46                 if not isinstance(e.reason, NewConnectionError):
build 11-Aug-2017 16:09:46                     raise ConnectTimeout(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             if isinstance(e.reason, ResponseError):
build 11-Aug-2017 16:09:46                 raise RetryError(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46             if isinstance(e.reason, _ProxyError):
build 11-Aug-2017 16:09:46                 raise ProxyError(e, request=request)
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 >           raise ConnectionError(e, request=request)
build 11-Aug-2017 16:09:46 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/apithis_path_does_not_exist (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 ______________________________ test_sensors_post _______________________________
build 11-Aug-2017 16:09:46
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_sensors_post(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         sensor_json = client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
build 11-Aug-2017 16:09:46         response = client.sensor_post(sensor_json)
build 11-Aug-2017 16:09:46 >       body = response.json()
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_sensors.py:14: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 70 DEBUG Adding sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 client.py 149 ERROR POST http://localhost:9000/clowder/api/geostreams/sensors: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors?key=r1ek3rs (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46 _______________________________ test_sensors_get _______________________________
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_sensors_get(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = client.sensor_get(sensor_id)
build 11-Aug-2017 16:09:46 >       sensor = response.json()
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_sensors.py:25: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 43 DEBUG Getting sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 sensors.py 47 ERROR Error retrieving sensor : not enough arguments for format string
build 11-Aug-2017 16:09:46 _____________________________ test_sensors_delete ______________________________
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_sensors_delete(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = client.sensor_delete(sensor_id)
build 11-Aug-2017 16:09:46 >       sensor = response.json()
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_sensors.py:35: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 104 DEBUG Deleting sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 client.py 214 ERROR DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46 ______________________________ test_streams_post _______________________________
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_streams_post(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id, stream_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         sensor_client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         sensor_json = sensor_client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
build 11-Aug-2017 16:09:46         sensor_body = sensor_client.sensor_post_json(sensor_json)
build 11-Aug-2017 16:09:46 >       sensor_id = sensor_body['id']
build 11-Aug-2017 16:09:46 E       TypeError: 'NoneType' object has no attribute '__getitem__'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_streams.py:16: TypeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 83 DEBUG Adding or getting sensor
build 11-Aug-2017 16:09:46 sensors.py 56 DEBUG Getting sensor Test Sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 sensors.py 60 ERROR Error retrieving sensor Test Sensor: not enough arguments for format string
build 11-Aug-2017 16:09:46 sensors.py 95 ERROR Error adding sensor Test Sensor: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46 _______________________________ test_streams_get _______________________________
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_streams_get(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id, stream_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         stream_client = StreamsApi(host=host, key=key)
build 11-Aug-2017 16:09:46 >       stream = stream_client.stream_get_by_name("Test Sensor")
build 11-Aug-2017 16:09:46 E       AttributeError: 'StreamsApi' object has no attribute 'stream_get_by_name'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_streams.py:29: AttributeError
build 11-Aug-2017 16:09:46 _____________________________ test_streams_delete ______________________________
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 caplog =
build 11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46     def test_streams_delete(caplog, host, key):
build 11-Aug-2017 16:09:46         global sensor_id, stream_id
build 11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
build 11-Aug-2017 16:09:46         sensor_client = SensorsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = sensor_client.sensor_delete(sensor_id)
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46         stream_client = StreamsApi(host=host, key=key)
build 11-Aug-2017 16:09:46         response = stream_client.stream_delete(stream_id)
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 >       stream = response.json()
build 11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
build 11-Aug-2017 16:09:46 
build 11-Aug-2017 16:09:46 tests/test_streams.py:43: AttributeError
build 11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
build 11-Aug-2017 16:09:46 sensors.py 104 DEBUG Deleting sensor
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 client.py 214 ERROR DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46 streams.py 91 DEBUG Deleting stream
build 11-Aug-2017 16:09:46 connectionpool.py 213 INFO Starting new HTTP connection (1): localhost
build 11-Aug-2017 16:09:46 client.py 214 ERROR DELETE http://localhost:9000/clowder/api/geostreams/streams/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/streams/?key=r1ek3rs (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
build 11-Aug-2017 16:09:46 ============================ pytest-warning summary ============================
build 11-Aug-2017 16:09:46 WI1 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/pytest_capturelog.py:171 'pytest_runtest_makereport' hook uses deprecated __multicall__ argument
build 11-Aug-2017 16:09:46 WC1 None pytest_funcarg__caplog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0. Please remove the prefix and use the @pytest.fixture decorator instead.
build 11-Aug-2017 16:09:46 WC1 None pytest_funcarg__capturelog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0. Please remove the prefix and use the @pytest.fixture decorator instead.
build 11-Aug-2017 16:09:46 =========== 10 failed, 29 passed, 3 pytest-warnings in 0.58 seconds ============
simple 11-Aug-2017 16:09:46 Failing task since return code of [/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-27-ScriptBuildTask-5556430487952047889.sh] was 1 while expected 0
simple 11-Aug-2017 16:09:46 Finished task 'pytest' with result: Failed
simple 11-Aug-2017 16:09:46 Starting task 'test results' of type 'com.atlassian.bamboo.plugins.testresultparser:task.testresultparser.junit'
simple 11-Aug-2017 16:09:46 Parsing test results under /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1...
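Nearly all of the AttributeError and TypeError failures above share one root cause: with Clowder unreachable on localhost:9000, the pyclowder2 client methods return None instead of a response object, and the tests then call `.json()` (or index into the body) unconditionally. A minimal sketch of the pattern, where `json_or_none` and `FakeResponse` are hypothetical names for illustration, not part of pyclowder2:

```python
def json_or_none(response):
    """Decode a JSON body only if a response was actually received.

    The client returns None when the HTTP request fails (e.g. [Errno 111]
    Connection refused), so guard before calling .json().
    """
    if response is None:
        return None
    return response.json()


class FakeResponse(object):
    """Minimal stand-in for a successful requests.Response (illustration only)."""
    def json(self):
        return {"id": "abc123"}


ok = json_or_none(FakeResponse())   # {'id': 'abc123'}
failed = json_or_none(None)         # None, instead of AttributeError
```

With a guard like this (or an explicit `assert response is not None` in each test), a refused connection would surface as one clear failure rather than cascading secondary AttributeErrors.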
simple 11-Aug-2017 16:09:46 Failing task since 10 failing test cases were found.
simple 11-Aug-2017 16:09:46 Finished task 'test results' with result: Failed
simple 11-Aug-2017 16:09:46 Running post build plugin 'Docker Container Cleanup'
simple 11-Aug-2017 16:09:46 Running post build plugin 'NCover Results Collector'
simple 11-Aug-2017 16:09:46 Running post build plugin 'Clover Results Collector'
simple 11-Aug-2017 16:09:46 Running post build plugin 'npm Cache Cleanup'
simple 11-Aug-2017 16:09:46 Running post build plugin 'Artifact Copier'
simple 11-Aug-2017 16:09:46 Finalising the build...
simple 11-Aug-2017 16:09:46 Stopping timer.
simple 11-Aug-2017 16:09:46 Build CATS-PYC26-JOB1-27 completed.
simple 11-Aug-2017 16:09:46 Running on server: post build plugin 'NCover Results Collector'
simple 11-Aug-2017 16:09:46 Running on server: post build plugin 'Build Hanging Detection Configuration'
simple 11-Aug-2017 16:09:46 Running on server: post build plugin 'Clover Delta Calculator'
simple 11-Aug-2017 16:09:46 Running on server: post build plugin 'Maven Dependencies Postprocessor'
simple 11-Aug-2017 16:09:46 All post build plugins have finished
simple 11-Aug-2017 16:09:46 Generating build results summary...
simple 11-Aug-2017 16:09:47 Saving build results to disk...
simple 11-Aug-2017 16:09:47 Logging substituted variables...
simple 11-Aug-2017 16:09:47 Indexing build results...
simple 11-Aug-2017 16:09:47 Finished building CATS-PYC26-JOB1-27.
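A secondary bug visible in the captured logs is the repeated "not enough arguments for format string" message from sensors.py. That is CPython's standard error text when a %-style format string receives fewer values than it has placeholders, which here also masked the real connection error. A minimal reproduction, assuming a format string of the same shape as the log messages (the actual sensors.py code is not shown in this log):

```python
# Reproduce the error text seen in the sensors.py ERROR lines.
# The format string below is illustrative only.
try:
    "Error retrieving sensor %s: %s" % ("Test Sensor",)  # two %s, one value
    message = None
except TypeError as err:
    message = str(err)

# message == 'not enough arguments for format string'
```

The usual fix is to pass the values to the logger itself, e.g. `logging.error("Error retrieving sensor %s: %s", name, exc)`, which both supplies all arguments and defers formatting.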