Custom Query (101 matches)

Results (31 - 33 of 101)

Ticket Resolution Summary Owner Reporter
#15 wontfix incremental/diff based job reporting somebody anonymous
Description

Only inject metrics when changes occur in the joblist, to improve performance and reduce overhead.

I.e.: <GMETRIC NAME="+MONARCH-JOB-<ID>" VALUE="<stats>"> <GMETRIC NAME="-MONARCH-JOB-<ID>" VALUE="<stats>">

Or something similar. Perhaps a state file would be useful, for example for the jobarchived?
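A minimal sketch of such diff-based reporting (all names here are hypothetical illustrations, not Monarch's actual API): compare the previous and current job lists and emit +/- metrics only for jobs that appeared or disappeared since the last poll.

```python
def diff_job_metrics(prev_jobs, curr_jobs):
    """Return gmetric-style (name, stats) pairs only for jobs that
    appeared (+) or disappeared (-) since the previous poll.

    prev_jobs/curr_jobs map job id -> stats string."""
    added   = set(curr_jobs) - set(prev_jobs)
    removed = set(prev_jobs) - set(curr_jobs)
    metrics = []
    for job_id in sorted(added):
        metrics.append(('+MONARCH-JOB-%s' % job_id, curr_jobs[job_id]))
    for job_id in sorted(removed):
        metrics.append(('-MONARCH-JOB-%s' % job_id, prev_jobs[job_id]))
    return metrics

# Example: job 2 finished and job 3 started between two polls
prev = {'1': 'stats1', '2': 'stats2'}
curr = {'1': 'stats1', '3': 'stats3'}
print(diff_job_metrics(prev, curr))
# -> [('+MONARCH-JOB-3', 'stats3'), ('-MONARCH-JOB-2', 'stats2')]
```

An unchanged job list produces no metrics at all, which is the overhead reduction the ticket asks for.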

#164 fixed job archive database housekeeping at intervals ramonb ramonb
Description

When jobarchived is started, a check is performed for any stale or outdated information in the database: for example, jobs whose ( start time + requested time ) < current time, i.e. whose requested walltime has already elapsed. Those jobs are then 'closed' in the database.

This can happen when a job started running while jobarchived was running, but finished after jobarchived had stopped.

Theoretically this can also occur while jobarchived is still running, if for example jobmond has stopped running.

The housekeeping checks done at startup of jobarchived should also be performed at a regular interval, to prevent stale information from lingering in the database. This only becomes an issue on very big systems and after long periods.
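The requested behaviour could be sketched as follows (a hypothetical illustration, not jobarchived's actual code; `find_stale_jobs`, `close_job`, and the job-dict layout are assumptions): factor the startup check into a function and call it in a loop, not just once.

```python
import time

def find_stale_jobs(jobs, now=None):
    """Return ids of jobs still marked running whose requested walltime
    has elapsed: start time + requested time < current time."""
    now = time.time() if now is None else now
    return [job_id for job_id, job in jobs.items()
            if job['status'] != 'F' and job['start'] + job['requested'] < now]

def housekeeping_loop(jobs, close_job, interval=3600, iterations=None):
    """Run the stale-job check every `interval` seconds instead of only
    at startup; `iterations` bounds the loop (None = run forever)."""
    n = 0
    while iterations is None or n < iterations:
        for job_id in find_stale_jobs(jobs):
            close_job(job_id)          # mark the job 'F' in the database
        n += 1
        if iterations is None or n < iterations:
            time.sleep(interval)

# Job 'a' requested 10s starting at t=0, so at t=100 it is stale;
# job 'b' (start 95, requested 10) is still within its walltime.
jobs = {'a': {'status': 'R', 'start': 0,  'requested': 10},
        'b': {'status': 'R', 'start': 95, 'requested': 10}}
print(find_stale_jobs(jobs, now=100))   # -> ['a']
```

In the daemon this loop would run in its own thread alongside the XML-parsing threads, reusing the same database query the startup check already performs.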

#171 fixed jobarchived crashed after 20 XML parsing iterations due to exception ramonb oufei.zhao@…
Description

At line 914 of jobarchived.py, a None check on timedout_jobs should be added before iterating over it: the code throws an exception when timedout_jobs is None. After adding one line to check for None, it works fine.

if timedout_jobs != None:    # <== added
    for j in timedout_jobs:
        del self.jobAttrs[ j ]
        del self.jobAttrsSaved[ j ]
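The failure mode and the guard can be reproduced standalone (a minimal sketch; the function name and dict arguments are illustrative, not jobarchived's real signature): iterating over None raises TypeError, so an empty database result must be checked before the loop.

```python
def close_timedout(job_attrs, job_attrs_saved, timedout_jobs):
    """Drop timed-out jobs from both caches, tolerating a None
    result from the database query (the one-line guard above)."""
    if timedout_jobs is not None:   # without this, `for j in None:` raises TypeError
        for j in timedout_jobs:
            del job_attrs[j]
            del job_attrs_saved[j]

attrs, saved = {'42': 'x'}, {'42': 'y'}
close_timedout(attrs, saved, None)    # empty query result: no-op instead of a crash
close_timedout(attrs, saved, ['42'])  # normal case: job 42 is purged
print(attrs, saved)
# -> {} {}
```

An equivalent idiom is `for j in timedout_jobs or []:`, but the explicit None check matches the patch and reads more clearly. Note `is not None` is preferred over `!= None` in idiomatic Python.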

See below for log and stack trace:

Mon 24 Jun 2013 18:00:59 - job_xml_thread(): Retrieving XML data..
Mon 24 Jun 2013 18:00:59 - job_xml_thread(): Done retrieving: data size 2656
Mon 24 Jun 2013 18:00:59 - job_xml_thread(): Parsing XML..
Mon 24 Jun 2013 18:00:59 - XML: Start document: iteration 20
Mon 24 Jun 2013 18:00:59 - XML: Processed 2 elements - found 0 jobs
Mon 24 Jun 2013 18:00:59 - self.heartbeat = 0
Mon 24 Jun 2013 18:00:59 - job_xml_thread(): Done parsing.
Mon 24 Jun 2013 18:00:59 - job_xml_thread(): Sleeping.. (15s)
Mon 24 Jun 2013 18:01:04 - job_xml_thread(): Retrieving XML data..
Mon 24 Jun 2013 18:01:04 - job_xml_thread(): Done retrieving: data size 2656
Mon 24 Jun 2013 18:01:04 - job_xml_thread(): Parsing XML..
Mon 24 Jun 2013 18:01:04 - Housekeeping: checking database for timed out jobs..
Mon 24 Jun 2013 18:01:04 - doDatabase(): get: SELECT * from jobs WHERE job_status != 'F'
Mon 24 Jun 2013 18:01:04 - doDatabase(): result: []
Exception in thread job_proc_thread:
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/threading.py", line 442, in __bootstrap
    self.run()
  File "/usr/local/lib/python2.4/threading.py", line 422, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/usr/sbin/jobarchived", line 870, in run
    xml.sax.parseString( my_data, self.myXMLHandler, self.myXMLError )
  File "/usr/local/lib/python2.4/xml/sax/__init__.py", line 49, in parseString
    parser.parse(inpsrc)
  File "/usr/local/lib/python2.4/xml/sax/expatreader.py", line 107, in parse
    xmlreader.IncrementalParser.parse(self, source)
  File "/usr/local/lib/python2.4/xml/sax/xmlreader.py", line 123, in parse
    self.feed(buffer)
  File "/usr/local/lib/python2.4/xml/sax/expatreader.py", line 200, in feed
    self._cont_handler.startDocument()
  File "/usr/sbin/jobarchived", line 914, in startDocument
    for j in timedout_jobs:
TypeError: iteration over non-sequence
