Nuxeo Drive / NXDRIVE-820

Reports are now memory and disk space efficient


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.323
    • Fix Version/s: 2.4.1
    • Component/s: Framework

      Description

      When the log trace is too long, a MemoryError is raised while writing the debug.log file into the zipped report.

      [INFO]      [exec] ERROR nuxeo-drive-client/tests/test_watchers.py::TestWatchers::test_local_watchdog_creation
      [INFO]      [exec] FAIL nuxeo-drive-client/tests/test_watchers.py::TestWatchers::test_local_watchdog_creation
      [INFO]      [exec] =================================== ERRORS ====================================
      [INFO]      [exec] _______ ERROR at teardown of TestWatchers.test_local_watchdog_creation ________
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] result = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec]     def run(self, result=None):
      [INFO]      [exec]         orig_result = result
      [INFO]      [exec]         if result is None:
      [INFO]      [exec]             result = self.defaultTestResult()
      [INFO]      [exec]             startTestRun = getattr(result, 'startTestRun', None)
      [INFO]      [exec]             if startTestRun is not None:
      [INFO]      [exec]                 startTestRun()
      [INFO]      [exec]         self._resultForDoCleanups = result
      [INFO]      [exec]         result.startTest(self)
      [INFO]      [exec]         testMethod = getattr(self, self._testMethodName)
      [INFO]      [exec]         if (getattr(self.__class__, "__unittest_skip__", False) or
      [INFO]      [exec]             getattr(testMethod, "__unittest_skip__", False)):
      [INFO]      [exec]             # If the class or method was skipped.
      [INFO]      [exec]             try:
      [INFO]      [exec]                 skip_why = (getattr(self.__class__, '__unittest_skip_why__', '')
      [INFO]      [exec]                             or getattr(testMethod, '__unittest_skip_why__', ''))
      [INFO]      [exec]                 self._addSkip(result, skip_why)
      [INFO]      [exec]             finally:
      [INFO]      [exec]                 result.stopTest(self)
      [INFO]      [exec]             return
      [INFO]      [exec]         try:
      [INFO]      [exec]             success = False
      [INFO]      [exec]             try:
      [INFO]      [exec]                 self.setUp()
      [INFO]      [exec]             except SkipTest as e:
      [INFO]      [exec]                 self._addSkip(result, str(e))
      [INFO]      [exec]             except KeyboardInterrupt:
      [INFO]      [exec]                 raise
      [INFO]      [exec]             except:
      [INFO]      [exec]                 result.addError(self, sys.exc_info())
      [INFO]      [exec]             else:
      [INFO]      [exec]                 try:
      [INFO]      [exec]                     testMethod()
      [INFO]      [exec]                 except KeyboardInterrupt:
      [INFO]      [exec]                     raise
      [INFO]      [exec]                 except self.failureException:
      [INFO]      [exec]                     result.addFailure(self, sys.exc_info())
      [INFO]      [exec]                 except _ExpectedFailure as e:
      [INFO]      [exec]                     addExpectedFailure = getattr(result, 'addExpectedFailure', None)
      [INFO]      [exec]                     if addExpectedFailure is not None:
      [INFO]      [exec]                         addExpectedFailure(self, e.exc_info)
      [INFO]      [exec]                     else:
      [INFO]      [exec]                         warnings.warn("TestResult has no addExpectedFailure method, reporting as passes",
      [INFO]      [exec]                                       RuntimeWarning)
      [INFO]      [exec]                         result.addSuccess(self)
      [INFO]      [exec]                 except _UnexpectedSuccess:
      [INFO]      [exec]                     addUnexpectedSuccess = getattr(result, 'addUnexpectedSuccess', None)
      [INFO]      [exec]                     if addUnexpectedSuccess is not None:
      [INFO]      [exec]                         addUnexpectedSuccess(self)
      [INFO]      [exec]                     else:
      [INFO]      [exec]                         warnings.warn("TestResult has no addUnexpectedSuccess method, reporting as failures",
      [INFO]      [exec]                                       RuntimeWarning)
      [INFO]      [exec]                         result.addFailure(self, sys.exc_info())
      [INFO]      [exec]                 except SkipTest as e:
      [INFO]      [exec]                     self._addSkip(result, str(e))
      [INFO]      [exec]                 except:
      [INFO]      [exec]                     result.addError(self, sys.exc_info())
      [INFO]      [exec]                 else:
      [INFO]      [exec]                     success = True
      [INFO]      [exec]                 try:
      [INFO]      [exec] >                   self.tearDown()
      [INFO]      [exec] cleanUpSuccess = True
      [INFO]      [exec] orig_result = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec] result     = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] success    = False
      [INFO]      [exec] testMethod = <bound method TestWatchers.test_local_watchdog_creation of <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>>
      [INFO]      [exec] ..\deploy-dir\drive-2.7.13-python\lib\unittest\case.py:358: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec]     def tearDown(self):
      [INFO]      [exec]         unittest.TestCase.tearDown(self)
      [INFO]      [exec]         if not self.tearedDown:
      [INFO]      [exec] >           self.tearDownApp()
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] nuxeo-drive-client\tests\common_unit_test.py:549: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] server_profile = None
      [INFO]      [exec]     def tearDownApp(self, server_profile=None):
      [INFO]      [exec]         if self.tearedDown:
      [INFO]      [exec]             return
      [INFO]      [exec]         if sys.exc_info() != (None, None, None):
      [INFO]      [exec] >           self.generate_report()
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] server_profile = None
      [INFO]      [exec] nuxeo-drive-client\tests\common_unit_test.py:555: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec]     def generate_report(self):
      [INFO]      [exec]         if "REPORT_PATH" not in os.environ:
      [INFO]      [exec]             return
      [INFO]      [exec]         report_path = os.path.join(os.environ["REPORT_PATH"],
      [INFO]      [exec]                                    self.id() + '-' + sys.platform)
      [INFO]      [exec] >       self.manager_1.generate_report(report_path)
      [INFO]      [exec] report_path = 'C:\\Jenkins\\0ebd1d51\\workspace\\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA/sources\\tests.test_watchers.TestWatchers.test_local_watchdog_creation-win32'
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] nuxeo-drive-client\tests\common_unit_test.py:678: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <nxdrive.manager.Manager object at 0x7FC4D940>
      [INFO]      [exec] path = 'C:\\Jenkins\\0ebd1d51\\workspace\\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA/sources\\tests.test_watchers.TestWatchers.test_local_watchdog_creation-win32'
      [INFO]      [exec]     def generate_report(self, path=None):
      [INFO]      [exec]         from nxdrive.report import Report
      [INFO]      [exec]         report = Report(self, path)
      [INFO]      [exec] >       report.generate()
      [INFO]      [exec] Report     = <class 'nxdrive.report.Report'>
      [INFO]      [exec] path       = 'C:\\Jenkins\\0ebd1d51\\workspace\\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA/sources\\tests.test_watchers.TestWatchers.test_local_watchdog_creation-win32'
      [INFO]      [exec] report     = <nxdrive.report.Report object at 0x7481BBB0>
      [INFO]      [exec] self       = <nxdrive.manager.Manager object at 0x7FC4D940>
      [INFO]      [exec] nuxeo-drive-client\nxdrive\manager.py:826: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <nxdrive.report.Report object at 0x7481BBB0>
      [INFO]      [exec]     def generate(self):
      [INFO]      [exec]         log.debug("Create report '%s'", self._report_name)
      [INFO]      [exec]         log.debug("Manager metrics: '%s'", self._manager.get_metrics())
      [INFO]      [exec]         with ZipFile(self._zipfile, 'w') as myzip:
      [INFO]      [exec]             dao = self._manager.get_dao()
      [INFO]      [exec]             self.copy_db(myzip, dao)
      [INFO]      [exec]             for engine in self._manager.get_engines().values():
      [INFO]      [exec]                 log.debug("Engine metrics: '%s'", engine.get_metrics())
      [INFO]      [exec]                 self.copy_db(myzip, engine.get_dao())
      [INFO]      [exec]                 # Might want threads too here
      [INFO]      [exec]             self.copy_logs(myzip)
      [INFO]      [exec] >           content = self._export_logs().encode('utf-8', errors='ignore')
      [INFO]      [exec] dao        = <nxdrive.engine.dao.sqlite.ManagerDAO object at 0x7FC4D990>
      [INFO]      [exec] engine     = <nxdrive.engine.engine.Engine object at 0x6CEE18F0>
      [INFO]      [exec] myzip      = <zipfile.ZipFile object at 0x7476AF90>
      [INFO]      [exec] self       = <nxdrive.report.Report object at 0x7481BBB0>
      [INFO]      [exec] nuxeo-drive-client\nxdrive\report.py:83: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec]     @staticmethod
      [INFO]      [exec]     def _export_logs():
      [INFO]      [exec]         logs = u''
      [INFO]      [exec]         logger = get_logger(None)
      [INFO]      [exec]         handler = get_handler(logger, "memory")
      [INFO]      [exec]         log_buffer = handler.get_buffer(MAX_LOG_DISPLAYED)
      [INFO]      [exec]         for record in log_buffer:
      [INFO]      [exec]             try:
      [INFO]      [exec]                 log_ = handler.format(record).decode('utf-8', errors='replace')
      [INFO]      [exec]             except UnicodeEncodeError:
      [INFO]      [exec]                 log_ = handler.format(record).encode('utf-8', errors='replace')\
      [INFO]      [exec]                                              .decode('utf-8')
      [INFO]      [exec] >           logs += log_ + u'\n'
      [INFO]      [exec] E           MemoryError
      [INFO]      [exec] handler    = <nxdrive.logging_config.CustomMemoryHandler object at 0x0132C510>
      [INFO]      [exec] log_       = "2017-04-07 18:22:46,098 2764 3028 TRACE    nxdrive.engine.watcher.local_watcher Fetched FS children info for u'/'"
      [INFO]      [exec] log_buffer = [<logging.LogRecord object at 0x7476AD30>, <logging.LogRecord object at 0x7476AE10>, <logging.LogRecord object at 0x74...gRecord object at 0x7481B4F0>, <logging.LogRecord object at 0x7481B3D0>, <logging.LogRecord object at 0x7481B450>, ...]
      [INFO]      [exec] logger     = <logging.RootLogger object at 0x0205DC90>
      [INFO]      [exec] logs       = "2017-04-07 18:23:45,375 2764 3892 DEBUG    nxdrive.report     Engine metrics: '{'files_size': 0, 'sync_files': 0, 'sy...ms
      [INFO]      [exec] 2017-04-07 18:22:46,099 2764 3028 DEBUG    nxdrive.engine.watcher.local_watcher Ended recursive local scan of u'/'
      [INFO]      [exec] "
      [INFO]      [exec] record     = <logging.LogRecord object at 0x7C4D03F0>
      [INFO]      [exec] nuxeo-drive-client\nxdrive\report.py:69: MemoryError
      [INFO]      [exec] ================================== FAILURES ===================================
      [INFO]      [exec] __________________ TestWatchers.test_local_watchdog_creation __________________
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] result = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec]     def run(self, result=None):
      [INFO]      [exec]         orig_result = result
      [INFO]      [exec]         if result is None:
      [INFO]      [exec]             result = self.defaultTestResult()
      [INFO]      [exec]             startTestRun = getattr(result, 'startTestRun', None)
      [INFO]      [exec]             if startTestRun is not None:
      [INFO]      [exec]                 startTestRun()
      [INFO]      [exec]         self._resultForDoCleanups = result
      [INFO]      [exec]         result.startTest(self)
      [INFO]      [exec]         testMethod = getattr(self, self._testMethodName)
      [INFO]      [exec]         if (getattr(self.__class__, "__unittest_skip__", False) or
      [INFO]      [exec]             getattr(testMethod, "__unittest_skip__", False)):
      [INFO]      [exec]             # If the class or method was skipped.
      [INFO]      [exec]             try:
      [INFO]      [exec]                 skip_why = (getattr(self.__class__, '__unittest_skip_why__', '')
      [INFO]      [exec]                             or getattr(testMethod, '__unittest_skip_why__', ''))
      [INFO]      [exec]                 self._addSkip(result, skip_why)
      [INFO]      [exec]             finally:
      [INFO]      [exec]                 result.stopTest(self)
      [INFO]      [exec]             return
      [INFO]      [exec]         try:
      [INFO]      [exec]             success = False
      [INFO]      [exec]             try:
      [INFO]      [exec]                 self.setUp()
      [INFO]      [exec]             except SkipTest as e:
      [INFO]      [exec]                 self._addSkip(result, str(e))
      [INFO]      [exec]             except KeyboardInterrupt:
      [INFO]      [exec]                 raise
      [INFO]      [exec]             except:
      [INFO]      [exec]                 result.addError(self, sys.exc_info())
      [INFO]      [exec]             else:
      [INFO]      [exec]                 try:
      [INFO]      [exec] >                   testMethod()
      [INFO]      [exec] cleanUpSuccess = True
      [INFO]      [exec] orig_result = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec] result     = <TestCaseFunction 'test_local_watchdog_creation'>
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] success    = False
      [INFO]      [exec] testMethod = <bound method TestWatchers.test_local_watchdog_creation of <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>>
      [INFO]      [exec] ..\deploy-dir\drive-2.7.13-python\lib\unittest\case.py:329: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] args = (<tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>,)
      [INFO]      [exec] kwargs = {}
      [INFO]      [exec]     @wraps(func)
      [INFO]      [exec]     def _callable(*args, **kwargs):
      [INFO]      [exec]         # Handle specific OS
      [INFO]      [exec]         if self._os == 'linux' and not AbstractOSIntegration.is_linux():
      [INFO]      [exec] >           return func(*args, **kwargs)
      [INFO]      [exec] args       = (<tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>,)
      [INFO]      [exec] func       = <function test_local_watchdog_creation at 0x036EAC30>
      [INFO]      [exec] kwargs     = {}
      [INFO]      [exec] self       = <tests.common_unit_test.RandomBug object at 0x02D67790>
      [INFO]      [exec] nuxeo-drive-client\tests\common_unit_test.py:98: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec]     @RandomBug('NXDRIVE-806', target='linux')
      [INFO]      [exec]     def test_local_watchdog_creation(self):
      [INFO]      [exec]         # Test the creation after first local scan
      [INFO]      [exec]         self.queue_manager_1.suspend()
      [INFO]      [exec]         self.queue_manager_1._disable = True
      [INFO]      [exec]         self.engine_1.start()
      [INFO]      [exec] >       self.wait_remote_scan()
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] nuxeo-drive-client\tests\test_watchers.py:81: 
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] timeout = 0, wait_for_engine_1 = True, wait_for_engine_2 = False
      [INFO]      [exec]     def wait_remote_scan(self, timeout=DEFAULT_WAIT_REMOTE_SCAN_TIMEOUT, wait_for_engine_1=True,
      [INFO]      [exec]                          wait_for_engine_2=False):
      [INFO]      [exec]         log.debug("Wait for remote scan")
      [INFO]      [exec]         self._wait_remote_scan = {self.engine_1.get_uid(): wait_for_engine_1,
      [INFO]      [exec]                                   self.engine_2.get_uid(): wait_for_engine_2}
      [INFO]      [exec]         while timeout > 0:
      [INFO]      [exec]             sleep(1)
      [INFO]      [exec]             if sum(self._wait_remote_scan.values()) == 0:
      [INFO]      [exec]                 log.debug("Ended wait for remote scan")
      [INFO]      [exec]                 return
      [INFO]      [exec]             timeout -= 1
      [INFO]      [exec] >       self.fail("Wait for remote scan timeout expired")
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] timeout    = 0
      [INFO]      [exec] wait_for_engine_1 = True
      [INFO]      [exec] wait_for_engine_2 = False
      [INFO]      [exec] nuxeo-drive-client\tests\common_unit_test.py:478: 
      [INFO]      [exec] Traceback (most recent call last):
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\lib\runpy.py", line 174, in _run_module_as_main
      [INFO]      [exec]     "__main__", fname, loader, pkg_name)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\lib\runpy.py", line 72, in _run_code
      [INFO]      [exec]     exec code in run_globals
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\pytest.py", line 17, in <module>
      [INFO]      [exec]     raise SystemExit(pytest.main())
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\config.py", line 57, in main
      [INFO]      [exec]     return config.hook.pytest_cmdline_main(config=config)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 745, in __call__
      [INFO]      [exec]     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 339, in _hookexec
      [INFO]      [exec]     return self._inner_hookexec(hook, methods, kwargs)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 334, in <lambda>
      [INFO]      [exec]     _MultiCall(methods, kwargs, hook.spec_opts).execute()
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 614, in execute
      [INFO]      [exec]     res = hook_impl.function(*args)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\main.py", line 127, in pytest_cmdline_main
      [INFO]      [exec]     return wrap_session(config, _main)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\main.py", line 107, in wrap_session
      [INFO]      [exec]     config.hook.pytest_keyboard_interrupt(excinfo=excinfo)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 745, in __call__
      [INFO]      [exec]     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 339, in _hookexec
      [INFO]      [exec]     return self._inner_hookexec(hook, methods, kwargs)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 334, in <lambda>
      [INFO]      [exec]     _MultiCall(methods, kwargs, hook.spec_opts).execute()
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\vendored_packages\pluggy.py", line 614, in execute
      [INFO]      [exec]     res = hook_impl.function(*args)
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\capture.py", line 144, in pytest_keyboard_interrupt
      [INFO]      [exec]     self.reset_capturings()
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\capture.py", line 81, in reset_capturings
      [INFO]      [exec]     cap.pop_outerr_to_orig()
      [INFO]      [exec] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
      [INFO]      [exec] self = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] msg = 'Wait for remote scan timeout expired'
      [INFO]      [exec]     def fail(self, msg=None):
      [INFO]      [exec]         """Fail immediately, with the given message."""
      [INFO]      [exec] >       raise self.failureException(msg)
      [INFO]      [exec] E       AssertionError: Wait for remote scan timeout expired
      [INFO]      [exec] msg        = 'Wait for remote scan timeout expired'
      [INFO]      [exec] self       = <tests.test_watchers.TestWatchers testMethod=test_local_watchdog_creation>
      [INFO]      [exec] ..\deploy-dir\drive-2.7.13-python\lib\unittest\case.py:410: AssertionError
      [INFO]      [exec] ======== 1 failed, 245 passed, 25 skipped, 1 error in 8145.72 seconds =========
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\capture.py", line 275, in pop_outerr_to_orig
      [INFO]      [exec]     out, err = self.readouterr()
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\capture.py", line 315, in readouterr
      [INFO]      [exec]     self.err.snap() if self.err is not None else "")
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\_pytest\capture.py", line 366, in snap
      [INFO]      [exec]     res = py.builtin._totext(res, enc, "replace")
      [INFO]      [exec]   File "C:\Jenkins\0ebd1d51\workspace\VE-808-mark-several-randoms-35HO5XFZKOQ3VQHN4MIJNNA2GK54WYXNG3QOGLZ33L2ZCHAO7QNA\deploy-dir\drive-2.7.13-python\lib\encodings\utf_8.py", line 16, in decode
      [INFO]      [exec]     return codecs.utf_8_decode(input, errors, True)
      [INFO]      [exec] MemoryError
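The failure comes from `_export_logs()` accumulating every formatted record into one ever-growing string (`logs += log_ + u'\n'`), so peak memory is proportional to the whole log buffer. A memory-efficient alternative is to yield lines one at a time and stream them straight into the zip entry. The sketch below is illustrative only: `iter_log_lines` and `write_logs_to_zip` are hypothetical names, not the actual `Report` API, and it assumes a Python 3 `zipfile` (where `ZipFile.open(name, 'w')` supports writing).

```python
from io import BytesIO
from zipfile import ZipFile, ZIP_DEFLATED

def iter_log_lines(records, formatter):
    # Yield formatted lines one at a time instead of concatenating
    # them into a single huge string -- the pattern that raised
    # MemoryError in Report._export_logs above.
    for record in records:
        yield formatter(record) + "\n"

def write_logs_to_zip(zip_target, records, formatter,
                      arcname="logs/debug.log"):
    # Stream the lines into the archive entry so peak memory stays
    # bounded by one line, not the whole in-memory log buffer.
    with ZipFile(zip_target, "w", ZIP_DEFLATED) as myzip:
        with myzip.open(arcname, "w") as entry:
            for line in iter_log_lines(records, formatter):
                entry.write(line.encode("utf-8", errors="replace"))
```

The key change is that no intermediate string ever holds more than a single record, so the export cost is dominated by the (compressed) disk write rather than RAM.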
      

      Also, use deflate compression by default, as it is very efficient on plain-text files such as logs.


      Time Tracking

      • Estimated: Not Specified
      • Remaining: 0m
      • Logged: 1h