Next Generation of pyClowder

Build: #27 failed

Job: Default Job failed

Build log

The build generated 569 lines of output.

11-Aug-2017 16:09:38 Build Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27) started building on agent buildserver-3.os.ncsa.edu
11-Aug-2017 16:09:38 Remote agent on host buildserver-3.os.ncsa.edu
11-Aug-2017 16:09:38 Build working directory is /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
11-Aug-2017 16:09:38 Executing build Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27)
11-Aug-2017 16:09:38 Starting task 'Checkout Default Repository' of type 'com.atlassian.bamboo.plugins.vcs:task.vcs.checkout'
11-Aug-2017 16:09:38 Updating source code to revision: e2026b1974297b550adbbb55bb3f5f6119c91bab
11-Aug-2017 16:09:38 Creating local git repository in '/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/.git'.
11-Aug-2017 16:09:38 Initialized empty Git repository in /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/.git/
11-Aug-2017 16:09:38 Fetching 'refs/heads/geostreams' from 'ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git'. Will try to do a shallow fetch.
11-Aug-2017 16:09:38 Warning: Permanently added '[127.0.0.1]:37712' (RSA) to the list of known hosts.
11-Aug-2017 16:09:38 From ssh://127.0.0.1:37712/cats/pyclowder2
11-Aug-2017 16:09:38 * [new branch]      geostreams -> geostreams
11-Aug-2017 16:09:38 Checking out revision e2026b1974297b550adbbb55bb3f5f6119c91bab.
11-Aug-2017 16:09:38 Switched to branch 'geostreams'
11-Aug-2017 16:09:38 Updated source code to revision: e2026b1974297b550adbbb55bb3f5f6119c91bab
11-Aug-2017 16:09:38 Finished task 'Checkout Default Repository' with result: Success
11-Aug-2017 16:09:38 Running pre-build action: VCS Version Collector
11-Aug-2017 16:09:38 Starting task 'pytest' of type 'com.atlassian.bamboo.plugins.scripttask:task.builder.script'
11-Aug-2017 16:09:38
Beginning to execute external process for build 'Clowder - pyclowder2 - geostreams - Default Job #27 (CATS-PYC26-JOB1-27)'
... running command line:
/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-27-ScriptBuildTask-5556430487952047889.sh
... in: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
... using extra environment variables:
bamboo_planRepository_1_branch=geostreams
bamboo_capability_system_builder_node_Node_js_v6_9_1=/home/bamboo/node-v6.9.1/bin/node
bamboo_capability_system_builder_command_npm_6=/home/bamboo/node-v6.9.1/bin/npm
bamboo_capability_system_builder_command_buckminster_4_3=/home/bamboo/buckminster-4.3/buckminster
bamboo_capability_system_builder_command_buckminster_4_2=/home/bamboo/buckminster-4.2/buckminster
bamboo_planRepository_1_branchDisplayName=geostreams
bamboo_repository_revision_number=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_resultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-27
bamboo_repository_127172662_previous_revision_number=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_capability_system_builder_command_sphinx=/usr/bin/sphinx-build
bamboo_planRepository_1_name=pyclowder2
bamboo_repository_127172662_branch_name=geostreams
bamboo_build_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_buildKey=CATS-PYC26-JOB1
bamboo_capability_system_os=linux
bamboo_repository_127172662_git_branch=geostreams
bamboo_shortPlanName=geostreams
bamboo_repository_127172662_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_repository_127172662_revision_number=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_planRepository_name=pyclowder2
bamboo_buildNumber=27
bamboo_repository_127172662_name=pyclowder2
bamboo_shortJobName=Default Job
bamboo_buildResultsUrl=https://opensource.ncsa.illinois.edu/bamboo/browse/CATS-PYC26-JOB1-27
bamboo_planRepository_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_agentId=143032321
bamboo_planName=Clowder - pyclowder2 - geostreams
bamboo_shortPlanKey=PYC26
bamboo_capability_system_builder_command_sbt_0_13=/home/bamboo/sbt-0.13.2/bin/sbt
bamboo_shortJobKey=JOB1
bamboo_capability_system_builder_node_Node_js_v0_10_28=/home/bamboo/node-v0.10.28/bin/node
bamboo_planRepository_revision=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_repository_previous_revision_number=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_buildTimeStamp=2017-08-11T16:09:37.731-05:00
bamboo_capability_system_builder_command_npm=/home/bamboo/node-v0.10.28/bin/npm
bamboo_planRepository_previousRevision=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_capability_system_builder_mvn2_Maven_2=/home/bamboo/apache-maven-2.2.1
bamboo_buildResultKey=CATS-PYC26-JOB1-27
bamboo_repository_git_branch=geostreams
bamboo_repository_branch_name=geostreams
bamboo_buildPlanName=Clowder - pyclowder2 - geostreams - Default Job
bamboo_planRepository_1_revision=e2026b1974297b550adbbb55bb3f5f6119c91bab
bamboo_capability_system_builder_command_python3=/usr/bin/python3
bamboo_repository_name=pyclowder2
bamboo_repository_127172662_git_username=
bamboo_buildFailed=false
bamboo_capability_system_docker_executable=/usr/bin/docker
bamboo_capability_system_builder_command_grunt=/home/bamboo/node-v0.10.28/bin/grunt
bamboo_planRepository_branch=geostreams
bamboo_agentWorkingDirectory=/home/bamboo/bamboo-agent-home/xml-data/build-dir
bamboo_capability_system_git_executable=/usr/bin/git
bamboo_planRepository_1_previousRevision=7cf00c3e9dbb5a6c53ae5aeed1f6887b963d3111
bamboo_repository_git_username=
bamboo_capability_system_builder_sbt_SBT_0_13_13=/home/bamboo/sbt-0.13.13
bamboo_planRepository_branchDisplayName=geostreams
bamboo_capability_system_builder_command_phantomjs=/home/bamboo/phantomjs-1.9.8/bin/phantomjs
bamboo_planRepository_1_type=bbserver
bamboo_planRepository_branchName=geostreams
bamboo_capability_system_builder_command_python2_7=/usr/bin/python2.7
bamboo_capability_system_hostname=buildserver-1
bamboo_capability_system_jdk_JDK=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_software_mongo=/usr/bin/mongo
bamboo_plan_storageTag=plan-126976103
bamboo_capability_system_software_rabbitmq=/usr/sbin/rabbitmqctl
bamboo_capability_system_builder_command_casperjs=/home/bamboo/node-v0.10.28/bin/casperjs
bamboo_planRepository_type=bbserver
bamboo_planRepository_1_username=
bamboo_capability_system_jdk_JDK_1_8_0_66=/home/bamboo/jdk1.8.0_66
bamboo_repository_git_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_capability_system_builder_node_Node_js=/home/bamboo/node-v0.10.28/bin/node
bamboo_capability_system_builder_ant_Ant=/home/bamboo/apache-ant-1.9.4
bamboo_capability_system_builder_mvn3_Maven_3=/home/bamboo/apache-maven-3.3.9
bamboo_working_directory=/home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1
bamboo_planKey=CATS-PYC26
bamboo_planRepository_1_repositoryUrl=ssh://git@opensource.ncsa.illinois.edu:7999/cats/pyclowder2.git
bamboo_planRepository_username=
bamboo_capability_system_jdk_JDK_1_8=/home/bamboo/jdk1.8.0_66
bamboo_capability_system_jdk_JDK_1_6=/home/bamboo/jdk1.6.0_41
bamboo_capability_system_builder_command_mkcrx=/home/bamboo/mkcrx/mkcrx.sh
bamboo_capability_system_jdk_JDK_1_7=/home/bamboo/jdk1.7.0_60
bamboo_planRepository_1_branchName=geostreams
11-Aug-2017 16:09:39 New python executable in /tmp/virtualenv/pyclowder2/bin/python2
11-Aug-2017 16:09:39 Also creating executable in /tmp/virtualenv/pyclowder2/bin/python
11-Aug-2017 16:09:42 Installing setuptools, pkg_resources, pip, wheel...done.
11-Aug-2017 16:09:42 Running virtualenv with interpreter /usr/bin/python2
11-Aug-2017 16:09:42 Collecting enum34==1.1.6 (from -r requirements.txt (line 1))
11-Aug-2017 16:09:42   Using cached enum34-1.1.6-py2-none-any.whl
11-Aug-2017 16:09:42 Collecting et-xmlfile==1.0.1 (from -r requirements.txt (line 2))
11-Aug-2017 16:09:42 Collecting jdcal==1.3 (from -r requirements.txt (line 3))
11-Aug-2017 16:09:43 Collecting openpyxl==2.4.1 (from -r requirements.txt (line 4))
11-Aug-2017 16:09:43 Collecting pika==0.10.0 (from -r requirements.txt (line 5))
11-Aug-2017 16:09:43   Using cached pika-0.10.0-py2.py3-none-any.whl
11-Aug-2017 16:09:43 Collecting python-dateutil==2.6.0 (from -r requirements.txt (line 6))
11-Aug-2017 16:09:43   Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
11-Aug-2017 16:09:43 Collecting pytz==2016.10 (from -r requirements.txt (line 7))
11-Aug-2017 16:09:43   Using cached pytz-2016.10-py2.py3-none-any.whl
11-Aug-2017 16:09:43 Collecting PyYAML==3.11 (from -r requirements.txt (line 8))
11-Aug-2017 16:09:43 Collecting requests==2.10.0 (from -r requirements.txt (line 9))
11-Aug-2017 16:09:43   Using cached requests-2.10.0-py2.py3-none-any.whl
11-Aug-2017 16:09:43 Collecting six==1.10.0 (from -r requirements.txt (line 10))
11-Aug-2017 16:09:43   Using cached six-1.10.0-py2.py3-none-any.whl
11-Aug-2017 16:09:43 Collecting wheel==0.24.0 (from -r requirements.txt (line 11))
11-Aug-2017 16:09:44   Using cached wheel-0.24.0-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting pytest==3.0.3 (from -r requirements.txt (line 12))
11-Aug-2017 16:09:44   Using cached pytest-3.0.3-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting pytest-pep8==1.0.6 (from -r requirements.txt (line 13))
11-Aug-2017 16:09:44 Collecting pytest-capturelog==0.7 (from -r requirements.txt (line 14))
11-Aug-2017 16:09:44 Collecting urllib3==1.14 (from -r requirements.txt (line 15))
11-Aug-2017 16:09:44   Using cached urllib3-1.14-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting py>=1.4.29 (from pytest==3.0.3->-r requirements.txt (line 12))
11-Aug-2017 16:09:44   Using cached py-1.4.34-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting pep8>=1.3 (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
11-Aug-2017 16:09:44   Using cached pep8-1.7.0-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting pytest-cache (from pytest-pep8==1.0.6->-r requirements.txt (line 13))
11-Aug-2017 16:09:44 Collecting execnet>=1.1.dev1 (from pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
11-Aug-2017 16:09:44   Using cached execnet-1.4.1-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Collecting apipkg>=1.4 (from execnet>=1.1.dev1->pytest-cache->pytest-pep8==1.0.6->-r requirements.txt (line 13))
11-Aug-2017 16:09:44   Using cached apipkg-1.4-py2.py3-none-any.whl
11-Aug-2017 16:09:44 Installing collected packages: enum34, et-xmlfile, jdcal, openpyxl, pika, six, python-dateutil, pytz, PyYAML, requests, wheel, py, pytest, pep8, apipkg, execnet, pytest-cache, pytest-pep8, pytest-capturelog, urllib3
11-Aug-2017 16:09:45   Found existing installation: wheel 0.30.0a0
11-Aug-2017 16:09:45     Uninstalling wheel-0.30.0a0:
11-Aug-2017 16:09:45       Successfully uninstalled wheel-0.30.0a0
11-Aug-2017 16:09:45 Successfully installed PyYAML-3.11 apipkg-1.4 enum34-1.1.6 et-xmlfile-1.0.1 execnet-1.4.1 jdcal-1.3 openpyxl-2.4.1 pep8-1.7.0 pika-0.10.0 py-1.4.34 pytest-3.0.3 pytest-cache-1.0 pytest-capturelog-0.7 pytest-pep8-1.0.6 python-dateutil-2.6.0 pytz-2016.10 requests-2.10.0 six-1.10.0 urllib3-1.14 wheel-0.24.0
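
For reference, the pinned dependency set the installer is reading can be reconstructed exactly from the "(from -r requirements.txt (line N))" annotations above:

    enum34==1.1.6
    et-xmlfile==1.0.1
    jdcal==1.3
    openpyxl==2.4.1
    pika==0.10.0
    python-dateutil==2.6.0
    pytz==2016.10
    PyYAML==3.11
    requests==2.10.0
    six==1.10.0
    wheel==0.24.0
    pytest==3.0.3
    pytest-pep8==1.0.6
    pytest-capturelog==0.7
    urllib3==1.14
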
11-Aug-2017 16:09:46 ============================= test session starts ==============================
11-Aug-2017 16:09:46 platform linux2 -- Python 2.7.12, pytest-3.0.3, py-1.4.34, pluggy-0.4.0
11-Aug-2017 16:09:46 rootdir: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1, inifile: setup.cfg
11-Aug-2017 16:09:46 plugins: pep8-1.0.6, capturelog-0.7
11-Aug-2017 16:09:46 collected 39 items
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 setup.py .
11-Aug-2017 16:09:46 docs/source/conf.py .
11-Aug-2017 16:09:46 pyclowder/__init__.py .
11-Aug-2017 16:09:46 pyclowder/client.py .
11-Aug-2017 16:09:46 pyclowder/collections.py .
11-Aug-2017 16:09:46 pyclowder/connectors.py .
11-Aug-2017 16:09:46 pyclowder/datasets.py .
11-Aug-2017 16:09:46 pyclowder/extractors.py .
11-Aug-2017 16:09:46 pyclowder/files.py .
11-Aug-2017 16:09:46 pyclowder/sections.py .
11-Aug-2017 16:09:46 pyclowder/utils.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/__init__.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/cache.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/csv.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/datapoints.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/datasets.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/geocode_convert.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/map_names.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/sensors.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/streams.py .
11-Aug-2017 16:09:46 pyclowder/geostreams/time_transformers.py .
11-Aug-2017 16:09:46 sample-extractors/echo/echo.py .
11-Aug-2017 16:09:46 sample-extractors/wordcount/wordcount.py .
11-Aug-2017 16:09:46 tests/__init__.py .
11-Aug-2017 16:09:46 tests/conftest.py .
11-Aug-2017 16:09:46 tests/test_datapoints.py .F
11-Aug-2017 16:09:46 tests/test_geostreams.py .FFF
11-Aug-2017 16:09:46 tests/test_sensors.py .FFF
11-Aug-2017 16:09:46 tests/test_streams.py .FFF
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 generated xml file: /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1/test-reports/results.xml
11-Aug-2017 16:09:46 =================================== FAILURES ===================================
11-Aug-2017 16:09:46 _____________________ test_datapoints_count_by_sensor_get ______________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26cad2d0>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_datapoints_count_by_sensor_get(caplog, host, key):
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = DatapointsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = client.datapoints_count_by_sensor_get(950)
11-Aug-2017 16:09:46 >       sensors = response.text
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'text'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_datapoints.py:10: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 datapoints.py               45 DEBUG    Counting datapoints by sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 datapoints.py               49 ERROR    Error counting datapoints by sensor 950: not enough arguments for format string
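
A note on the "not enough arguments for format string" errors (this one, and the identical ones from sensors.py further down): the message is the text of a TypeError raised by Python's % operator, caught by the client's except-and-log handler, which then returns None -- which is why the test subsequently dies on response.text. A minimal sketch of the suspected pattern; the real code lives in pyclowder/geostreams/datapoints.py, and the URL and method body here are assumptions:

    import logging
    import requests

    class DatapointsApi(object):
        def __init__(self, host, key):
            self.host, self.key = host, key

        def datapoints_count_by_sensor_get(self, sensor_id):
            logging.debug("Counting datapoints by sensor")
            try:
                # Suspected bug: two %s placeholders but a single value, so
                # the % operator raises TypeError("not enough arguments for
                # format string") before any request is even sent.
                url = "%s/api/geostreams/datapoints?sensor_id=%s" % sensor_id
                return requests.get(url)
            except Exception as exc:
                # Logs the TypeError's text -- the ERROR line above -- and
                # returns None, so the caller's response.text then raises
                # AttributeError.
                logging.error("Error counting datapoints by sensor %s: %s",
                              sensor_id, exc)
                return None
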
11-Aug-2017 16:09:46 _________________________________ test_version _________________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26cbaf90>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_version(caplog, host, key):
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = ClowderClient(host=host, key=key)
11-Aug-2017 16:09:46 >       version = client.version()
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_geostreams.py:12:
11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
11-Aug-2017 16:09:46 pyclowder/client.py:47: in version
11-Aug-2017 16:09:46     r = requests.get(url)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
11-Aug-2017 16:09:46     return request('get', url, params=params, **kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
11-Aug-2017 16:09:46     return session.request(method=method, url=url, **kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
11-Aug-2017 16:09:46     resp = self.send(prep, **send_kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
11-Aug-2017 16:09:46     r = adapter.send(request, **kwargs)
11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 self = <requests.adapters.HTTPAdapter object at 0x7f7d26cbae90>
11-Aug-2017 16:09:46 request = <PreparedRequest [GET]>, stream = False
11-Aug-2017 16:09:46 timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f7d26cba890>
11-Aug-2017 16:09:46 verify = True, cert = None, proxies = OrderedDict()
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
11-Aug-2017 16:09:46         """Sends PreparedRequest object. Returns Response object.
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
11-Aug-2017 16:09:46             :param stream: (optional) Whether to stream the request content.
11-Aug-2017 16:09:46             :param timeout: (optional) How long to wait for the server to send
11-Aug-2017 16:09:46                 data before giving up, as a float, or a :ref:`(connect timeout,
11-Aug-2017 16:09:46                 read timeout) <timeouts>` tuple.
11-Aug-2017 16:09:46             :type timeout: float or tuple
11-Aug-2017 16:09:46             :param verify: (optional) Whether to verify SSL certificates.
11-Aug-2017 16:09:46             :param cert: (optional) Any user-provided SSL certificate to be trusted.
11-Aug-2017 16:09:46             :param proxies: (optional) The proxies dictionary to apply to the request.
11-Aug-2017 16:09:46             """
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         conn = self.get_connection(request.url, proxies)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         self.cert_verify(conn, request.url, verify, cert)
11-Aug-2017 16:09:46         url = self.request_url(request, proxies)
11-Aug-2017 16:09:46         self.add_headers(request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         chunked = not (request.body is None or 'Content-Length' in request.headers)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         if isinstance(timeout, tuple):
11-Aug-2017 16:09:46             try:
11-Aug-2017 16:09:46                 connect, read = timeout
11-Aug-2017 16:09:46                 timeout = TimeoutSauce(connect=connect, read=read)
11-Aug-2017 16:09:46             except ValueError as e:
11-Aug-2017 16:09:46                 # this may raise a string formatting error.
11-Aug-2017 16:09:46                 err = ("Invalid timeout {0}. Pass a (connect, read) "
11-Aug-2017 16:09:46                        "timeout tuple, or a single float to set "
11-Aug-2017 16:09:46                        "both timeouts to the same value".format(timeout))
11-Aug-2017 16:09:46                 raise ValueError(err)
11-Aug-2017 16:09:46         else:
11-Aug-2017 16:09:46             timeout = TimeoutSauce(connect=timeout, read=timeout)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         try:
11-Aug-2017 16:09:46             if not chunked:
11-Aug-2017 16:09:46                 resp = conn.urlopen(
11-Aug-2017 16:09:46                     method=request.method,
11-Aug-2017 16:09:46                     url=url,
11-Aug-2017 16:09:46                     body=request.body,
11-Aug-2017 16:09:46                     headers=request.headers,
11-Aug-2017 16:09:46                     redirect=False,
11-Aug-2017 16:09:46                     assert_same_host=False,
11-Aug-2017 16:09:46                     preload_content=False,
11-Aug-2017 16:09:46                     decode_content=False,
11-Aug-2017 16:09:46                     retries=self.max_retries,
11-Aug-2017 16:09:46                     timeout=timeout
11-Aug-2017 16:09:46                 )
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             # Send the request.
11-Aug-2017 16:09:46             else:
11-Aug-2017 16:09:46                 if hasattr(conn, 'proxy_pool'):
11-Aug-2017 16:09:46                     conn = conn.proxy_pool
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                 try:
11-Aug-2017 16:09:46                     low_conn.putrequest(request.method,
11-Aug-2017 16:09:46                                         url,
11-Aug-2017 16:09:46                                         skip_accept_encoding=True)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     for header, value in request.headers.items():
11-Aug-2017 16:09:46                         low_conn.putheader(header, value)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     low_conn.endheaders()
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     for i in request.body:
11-Aug-2017 16:09:46                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
11-Aug-2017 16:09:46                         low_conn.send(i)
11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
11-Aug-2017 16:09:46                     low_conn.send(b'0\r\n\r\n')
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     # Receive the response from the server
11-Aug-2017 16:09:46                     try:
11-Aug-2017 16:09:46                         # For Python 2.7+ versions, use buffering of HTTP
11-Aug-2017 16:09:46                         # responses
11-Aug-2017 16:09:46                         r = low_conn.getresponse(buffering=True)
11-Aug-2017 16:09:46                     except TypeError:
11-Aug-2017 16:09:46                         # For compatibility with Python 2.6 versions and back
11-Aug-2017 16:09:46                         r = low_conn.getresponse()
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     resp = HTTPResponse.from_httplib(
11-Aug-2017 16:09:46                         r,
11-Aug-2017 16:09:46                         pool=conn,
11-Aug-2017 16:09:46                         connection=low_conn,
11-Aug-2017 16:09:46                         preload_content=False,
11-Aug-2017 16:09:46                         decode_content=False
11-Aug-2017 16:09:46                     )
11-Aug-2017 16:09:46                 except:
11-Aug-2017 16:09:46                     # If we hit any problems here, clean up the connection.
11-Aug-2017 16:09:46                     # Then, reraise so that we can handle the actual exception.
11-Aug-2017 16:09:46                     low_conn.close()
11-Aug-2017 16:09:46                     raise
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         except (ProtocolError, socket.error) as err:
11-Aug-2017 16:09:46             raise ConnectionError(err, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         except MaxRetryError as e:
11-Aug-2017 16:09:46             if isinstance(e.reason, ConnectTimeoutError):
11-Aug-2017 16:09:46                 # TODO: Remove this in 3.0.0: see #2811
11-Aug-2017 16:09:46                 if not isinstance(e.reason, NewConnectionError):
11-Aug-2017 16:09:46                     raise ConnectTimeout(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             if isinstance(e.reason, ResponseError):
11-Aug-2017 16:09:46                 raise RetryError(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             if isinstance(e.reason, _ProxyError):
11-Aug-2017 16:09:46                 raise ProxyError(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46 >           raise ConnectionError(e, request=request)
11-Aug-2017 16:09:46 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/version (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d26a46650>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 client.py                   46 DEBUG    GET http://localhost:9000/clowder/api/version
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
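
test_version, and every failure below that ends in ConnectionError, is environmental rather than a code bug: nothing is listening on localhost:9000, so any test that actually reaches the network fails with "Connection refused". One conventional guard -- a sketch only, not the repository's actual conftest.py, whose contents are not shown in this log -- is an autouse fixture that skips when Clowder is unreachable:

    import pytest
    import requests

    @pytest.fixture(autouse=True)
    def require_clowder(host):
        # `host` is the existing fixture visible in the failures above
        # ('http://localhost:9000/clowder').
        try:
            requests.get(host + "/api/version", timeout=2)
        except requests.ConnectionError:
            pytest.skip("no Clowder instance reachable at %s" % host)
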
11-Aug-2017 16:09:46 _______________________________ test_get_sensors _______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a24450>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_get_sensors(caplog, host, key):
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = client.sensors_get()
11-Aug-2017 16:09:46 >       sensors = response.json()
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_geostreams.py:21: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                  30 DEBUG    Getting all sensors
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 sensors.py                  34 ERROR    Error retrieving sensor list: not enough arguments for format string
11-Aug-2017 16:09:46 ____________________________ test_raise_for_status _____________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a2c390>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_raise_for_status(caplog, host, key):
11-Aug-2017 16:09:46         client = ClowderClient(host=host, key=key)
11-Aug-2017 16:09:46         try:
11-Aug-2017 16:09:46 >           client.get_json("this_path_does_not_exist")
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_geostreams.py:29:
11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
11-Aug-2017 16:09:46 pyclowder/client.py:69: in get_json
11-Aug-2017 16:09:46     r = requests.get(url, headers=self.headers)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:71: in get
11-Aug-2017 16:09:46     return request('get', url, params=params, **kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/api.py:57: in request
11-Aug-2017 16:09:46     return session.request(method=method, url=url, **kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:475: in request
11-Aug-2017 16:09:46     resp = self.send(prep, **send_kwargs)
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/sessions.py:585: in send
11-Aug-2017 16:09:46     r = adapter.send(request, **kwargs)
11-Aug-2017 16:09:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 self = <requests.adapters.HTTPAdapter object at 0x7f7d26a54490>
11-Aug-2017 16:09:46 request = <PreparedRequest [GET]>, stream = False
11-Aug-2017 16:09:46 timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f7d26a54910>
11-Aug-2017 16:09:46 verify = True, cert = None, proxies = OrderedDict()
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
11-Aug-2017 16:09:46         """Sends PreparedRequest object. Returns Response object.
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
11-Aug-2017 16:09:46             :param stream: (optional) Whether to stream the request content.
11-Aug-2017 16:09:46             :param timeout: (optional) How long to wait for the server to send
11-Aug-2017 16:09:46                 data before giving up, as a float, or a :ref:`(connect timeout,
11-Aug-2017 16:09:46                 read timeout) <timeouts>` tuple.
11-Aug-2017 16:09:46             :type timeout: float or tuple
11-Aug-2017 16:09:46             :param verify: (optional) Whether to verify SSL certificates.
11-Aug-2017 16:09:46             :param cert: (optional) Any user-provided SSL certificate to be trusted.
11-Aug-2017 16:09:46             :param proxies: (optional) The proxies dictionary to apply to the request.
11-Aug-2017 16:09:46             """
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         conn = self.get_connection(request.url, proxies)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         self.cert_verify(conn, request.url, verify, cert)
11-Aug-2017 16:09:46         url = self.request_url(request, proxies)
11-Aug-2017 16:09:46         self.add_headers(request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         chunked = not (request.body is None or 'Content-Length' in request.headers)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         if isinstance(timeout, tuple):
11-Aug-2017 16:09:46             try:
11-Aug-2017 16:09:46                 connect, read = timeout
11-Aug-2017 16:09:46                 timeout = TimeoutSauce(connect=connect, read=read)
11-Aug-2017 16:09:46             except ValueError as e:
11-Aug-2017 16:09:46                 # this may raise a string formatting error.
11-Aug-2017 16:09:46                 err = ("Invalid timeout {0}. Pass a (connect, read) "
11-Aug-2017 16:09:46                        "timeout tuple, or a single float to set "
11-Aug-2017 16:09:46                        "both timeouts to the same value".format(timeout))
11-Aug-2017 16:09:46                 raise ValueError(err)
11-Aug-2017 16:09:46         else:
11-Aug-2017 16:09:46             timeout = TimeoutSauce(connect=timeout, read=timeout)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         try:
11-Aug-2017 16:09:46             if not chunked:
11-Aug-2017 16:09:46                 resp = conn.urlopen(
11-Aug-2017 16:09:46                     method=request.method,
11-Aug-2017 16:09:46                     url=url,
11-Aug-2017 16:09:46                     body=request.body,
11-Aug-2017 16:09:46                     headers=request.headers,
11-Aug-2017 16:09:46                     redirect=False,
11-Aug-2017 16:09:46                     assert_same_host=False,
11-Aug-2017 16:09:46                     preload_content=False,
11-Aug-2017 16:09:46                     decode_content=False,
11-Aug-2017 16:09:46                     retries=self.max_retries,
11-Aug-2017 16:09:46                     timeout=timeout
11-Aug-2017 16:09:46                 )
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             # Send the request.
11-Aug-2017 16:09:46             else:
11-Aug-2017 16:09:46                 if hasattr(conn, 'proxy_pool'):
11-Aug-2017 16:09:46                     conn = conn.proxy_pool
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                 try:
11-Aug-2017 16:09:46                     low_conn.putrequest(request.method,
11-Aug-2017 16:09:46                                         url,
11-Aug-2017 16:09:46                                         skip_accept_encoding=True)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     for header, value in request.headers.items():
11-Aug-2017 16:09:46                         low_conn.putheader(header, value)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     low_conn.endheaders()
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     for i in request.body:
11-Aug-2017 16:09:46                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
11-Aug-2017 16:09:46                         low_conn.send(i)
11-Aug-2017 16:09:46                         low_conn.send(b'\r\n')
11-Aug-2017 16:09:46                     low_conn.send(b'0\r\n\r\n')
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     # Receive the response from the server
11-Aug-2017 16:09:46                     try:
11-Aug-2017 16:09:46                         # For Python 2.7+ versions, use buffering of HTTP
11-Aug-2017 16:09:46                         # responses
11-Aug-2017 16:09:46                         r = low_conn.getresponse(buffering=True)
11-Aug-2017 16:09:46                     except TypeError:
11-Aug-2017 16:09:46                         # For compatibility with Python 2.6 versions and back
11-Aug-2017 16:09:46                         r = low_conn.getresponse()
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46                     resp = HTTPResponse.from_httplib(
11-Aug-2017 16:09:46                         r,
11-Aug-2017 16:09:46                         pool=conn,
11-Aug-2017 16:09:46                         connection=low_conn,
11-Aug-2017 16:09:46                         preload_content=False,
11-Aug-2017 16:09:46                         decode_content=False
11-Aug-2017 16:09:46                     )
11-Aug-2017 16:09:46                 except:
11-Aug-2017 16:09:46                     # If we hit any problems here, clean up the connection.
11-Aug-2017 16:09:46                     # Then, reraise so that we can handle the actual exception.
11-Aug-2017 16:09:46                     low_conn.close()
11-Aug-2017 16:09:46                     raise
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         except (ProtocolError, socket.error) as err:
11-Aug-2017 16:09:46             raise ConnectionError(err, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         except MaxRetryError as e:
11-Aug-2017 16:09:46             if isinstance(e.reason, ConnectTimeoutError):
11-Aug-2017 16:09:46                 # TODO: Remove this in 3.0.0: see #2811
11-Aug-2017 16:09:46                 if not isinstance(e.reason, NewConnectionError):
11-Aug-2017 16:09:46                     raise ConnectTimeout(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             if isinstance(e.reason, ResponseError):
11-Aug-2017 16:09:46                 raise RetryError(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46             if isinstance(e.reason, _ProxyError):
11-Aug-2017 16:09:46                 raise ProxyError(e, request=request)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46 >           raise ConnectionError(e, request=request)
11-Aug-2017 16:09:46 E           ConnectionError: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/apithis_path_does_not_exist (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d26a549d0>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/requests/adapters.py:467: ConnectionError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 ______________________________ test_sensors_post _______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a24410>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_sensors_post(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         sensor_json = client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
11-Aug-2017 16:09:46         response = client.sensor_post(sensor_json)
11-Aug-2017 16:09:46 >       body = response.json()
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_sensors.py:14: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                  70 DEBUG    Adding sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 client.py                  149 ERROR    POST http://localhost:9000/clowder/api/geostreams/sensors: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d26a24e90>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46 _______________________________ test_sensors_get _______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a5c290>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_sensors_get(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = client.sensor_get(sensor_id)
11-Aug-2017 16:09:46 >       sensor = response.json()
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_sensors.py:25: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                  43 DEBUG    Getting sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 sensors.py                  47 ERROR    Error retrieving sensor : not enough arguments for format string
11-Aug-2017 16:09:46 _____________________________ test_sensors_delete ______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a2c110>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_sensors_delete(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = client.sensor_delete(sensor_id)
11-Aug-2017 16:09:46 >       sensor = response.json()
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_sensors.py:35: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                 104 DEBUG    Deleting sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 client.py                  214 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d26a568d0>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46 ______________________________ test_streams_post _______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a56dd0>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_streams_post(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id, stream_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         sensor_client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         sensor_json = sensor_client.sensor_create_json("Test Sensor", 40.1149202, -88.2270582, 0, "", "ER")
11-Aug-2017 16:09:46         sensor_body = sensor_client.sensor_post_json(sensor_json)
11-Aug-2017 16:09:46 >       sensor_id = sensor_body['id']
11-Aug-2017 16:09:46 E       TypeError: 'NoneType' object has no attribute '__getitem__'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_streams.py:16: TypeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                  83 DEBUG    Adding or getting sensor
11-Aug-2017 16:09:46 sensors.py                  56 DEBUG    Getting sensor Test Sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 sensors.py                  60 ERROR    Error retrieving sensor Test Sensor: not enough arguments for format string
11-Aug-2017 16:09:46 sensors.py                  95 ERROR    Error adding sensor Test Sensor: 'NoneType' object has no attribute 'json'
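
The captured log above shows this failure cascading through two layers before the test sees it: sensor_get_by_name hits the same format-string TypeError and returns None; sensor_post_json then calls .json() on that None, catches the resulting AttributeError, logs "Error adding sensor ...", and itself returns None; and the test finally crashes indexing sensor_body['id']. In sketch form (method names from the log; bodies and the sensor_json structure are assumed):

    import logging

    def sensor_post_json(self, sensor_json):
        """Sketch of SensorsApi.sensor_post_json's suspected control flow."""
        logging.debug("Adding or getting sensor")
        try:
            response = self.sensor_get_by_name(sensor_json["name"])  # -> None
            return response.json()  # AttributeError on None, caught below
        except Exception as exc:
            logging.error("Error adding sensor %s: %s", sensor_json["name"], exc)
            return None  # the test then fails on sensor_body['id']
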
11-Aug-2017 16:09:46 _______________________________ test_streams_get _______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d26a5c790>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_streams_get(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id, stream_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         stream_client = StreamsApi(host=host, key=key)
11-Aug-2017 16:09:46 >       stream = stream_client.stream_get_by_name("Test Sensor")
11-Aug-2017 16:09:46 E       AttributeError: 'StreamsApi' object has no attribute 'stream_get_by_name'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_streams.py:29: AttributeError
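
Unlike the connection-related failures, this one is a genuine API gap: StreamsApi simply has no stream_get_by_name method, so the test fails before any network activity. If the intent is to mirror the get-by-name behaviour sensors.py exhibits above, the missing method would look roughly like this (a sketch; the endpoint path and client interface are assumptions, not pyclowder's actual API):

    import logging

    class StreamsApi(object):
        def __init__(self, client):
            self.client = client  # assumed: a ClowderClient

        def stream_get_by_name(self, name):
            logging.debug("Getting stream %s", name)
            try:
                return self.client.get_json(
                    "/geostreams/streams?stream_name=%s" % name)
            except Exception as exc:
                logging.error("Error retrieving stream %s: %s", name, exc)
                return None
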
11-Aug-2017 16:09:46 _____________________________ test_streams_delete ______________________________
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 caplog = <pytest_capturelog.CaptureLogFuncArg object at 0x7f7d2699d7d0>
11-Aug-2017 16:09:46 host = 'http://localhost:9000/clowder', key = 'r1ek3rs'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46     def test_streams_delete(caplog, host, key):
11-Aug-2017 16:09:46         global sensor_id, stream_id
11-Aug-2017 16:09:46         caplog.setLevel(logging.DEBUG)
11-Aug-2017 16:09:46         sensor_client = SensorsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = sensor_client.sensor_delete(sensor_id)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46         stream_client = StreamsApi(host=host, key=key)
11-Aug-2017 16:09:46         response = stream_client.stream_delete(stream_id)
11-Aug-2017 16:09:46    
11-Aug-2017 16:09:46 >       stream = response.json()
11-Aug-2017 16:09:46 E       AttributeError: 'NoneType' object has no attribute 'json'
11-Aug-2017 16:09:46
11-Aug-2017 16:09:46 tests/test_streams.py:43: AttributeError
11-Aug-2017 16:09:46 --------------------------------- Captured log ---------------------------------
11-Aug-2017 16:09:46 sensors.py                 104 DEBUG    Deleting sensor
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 client.py                  214 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/sensors/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/sensors/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d2699d190>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46 streams.py                  91 DEBUG    Deleting stream
11-Aug-2017 16:09:46 connectionpool.py          213 INFO     Starting new HTTP connection (1): localhost
11-Aug-2017 16:09:46 client.py                  214 ERROR    DELETE http://localhost:9000/clowder/api/geostreams/streams/: HTTPConnectionPool(host='localhost', port=9000): Max retries exceeded with url: /clowder/api/geostreams/streams/?key=r1ek3rs (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f7d26a56fd0>: Failed to establish a new connection: [Errno 111] Connection refused',))
11-Aug-2017 16:09:46 ============================ pytest-warning summary ============================
11-Aug-2017 16:09:46 WI1 /tmp/virtualenv/pyclowder2/local/lib/python2.7/site-packages/pytest_capturelog.py:171 'pytest_runtest_makereport' hook uses deprecated __multicall__ argument
11-Aug-2017 16:09:46 WC1 None pytest_funcarg__caplog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
11-Aug-2017 16:09:46 WC1 None pytest_funcarg__capturelog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
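
The two WC1 warnings point at the same fix: the pytest_capturelog plugin still registers its fixtures with the legacy pytest_funcarg__ prefix, which pytest 4.0 removes. The modern spelling the warning asks for is the decorator form, roughly:

    import pytest
    from pytest_capturelog import CaptureLogFuncArg

    # Legacy style (what the plugin does today):
    #
    #     def pytest_funcarg__caplog(request):
    #         return CaptureLogFuncArg(request)
    #
    # Equivalent modern fixture (sketch; CaptureLogFuncArg is the plugin
    # class visible in the tracebacks above):

    @pytest.fixture
    def caplog(request):
        return CaptureLogFuncArg(request)
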
11-Aug-2017 16:09:46 =========== 10 failed, 29 passed, 3 pytest-warnings in 0.58 seconds ============
11-Aug-2017 16:09:46 Failing task since return code of [/home/bamboo/bamboo-agent-home/temp/CATS-PYC26-JOB1-27-ScriptBuildTask-5556430487952047889.sh] was 1 while expected 0
11-Aug-2017 16:09:46 Finished task 'pytest' with result: Failed
11-Aug-2017 16:09:46 Starting task 'test results' of type 'com.atlassian.bamboo.plugins.testresultparser:task.testresultparser.junit'
11-Aug-2017 16:09:46 Parsing test results under /home/bamboo/bamboo-agent-home/xml-data/build-dir/CATS-PYC26-JOB1...
11-Aug-2017 16:09:46 Failing task since 10 failing test cases were found.
11-Aug-2017 16:09:46 Finished task 'test results' with result: Failed
11-Aug-2017 16:09:46 Running post build plugin 'Docker Container Cleanup'
11-Aug-2017 16:09:46 Running post build plugin 'NCover Results Collector'
11-Aug-2017 16:09:46 Running post build plugin 'Clover Results Collector'
11-Aug-2017 16:09:46 Running post build plugin 'npm Cache Cleanup'
11-Aug-2017 16:09:46 Running post build plugin 'Artifact Copier'
11-Aug-2017 16:09:46 Finalising the build...
11-Aug-2017 16:09:46 Stopping timer.
11-Aug-2017 16:09:46 Build CATS-PYC26-JOB1-27 completed.
11-Aug-2017 16:09:46 Running on server: post build plugin 'NCover Results Collector'
11-Aug-2017 16:09:46 Running on server: post build plugin 'Build Hanging Detection Configuration'
11-Aug-2017 16:09:46 Running on server: post build plugin 'Clover Delta Calculator'
11-Aug-2017 16:09:46 Running on server: post build plugin 'Maven Dependencies Postprocessor'
11-Aug-2017 16:09:46 All post build plugins have finished
11-Aug-2017 16:09:46 Generating build results summary...
11-Aug-2017 16:09:47 Saving build results to disk...
11-Aug-2017 16:09:47 Logging substituted variables...
11-Aug-2017 16:09:47 Indexing build results...
11-Aug-2017 16:09:47 Finished building CATS-PYC26-JOB1-27.