linux - A Python background process (= child of the rsyslog process) only writes on close


The simple Python program below logs rsyslog data arriving via stdin to py.output.txt.
The issue is that it doesn't log the stdin data to the output in real time.

If ishome.py runs as a background child process (of rsyslog), no output is sent to py.output.txt. Only when I stop the master process does py.output.txt receive the output.

When run as a background process

When I terminate rsyslog, it sends an EOF to the child process ishome.py, which might be what triggers the actual write-out of the data.

When run as a foreground process

However, when I run ishome.py as a foreground process, py.output.txt gets updated in real time for every new entry. There is no need to close ishome.py; it writes out on every new event.
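One common cause of exactly this foreground/background difference is stdio-style buffering: streams attached to a terminal are usually line-buffered, while pipes and files are block-buffered, so output only appears when the buffer fills or the stream is closed. A minimal probe to check what the process actually sees (the script name buffer_probe.py is hypothetical, not part of the original setup):

    #!/usr/bin/env python
    # buffer_probe.py -- hypothetical diagnostic; reports whether the standard
    # streams are attached to a terminal. A non-tty stream is typically
    # block-buffered, which can make a child process appear to write only
    # on close.
    import sys

    sys.stderr.write("stdin  is a tty: %s\n" % sys.stdin.isatty())
    sys.stderr.write("stdout is a tty: %s\n" % sys.stdout.isatty())

Run in a terminal this prints True for both; run as a child of rsyslog it would print False, matching the delayed-write behaviour described above.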

Bash output

>> ps
root      4328     1  0 21:04 ?  00:00:00   /usr/sbin/rsyslogd -c5
root      4360  4328  1 21:04 ?  00:00:00     python /home/pi/script/ishome.py

>> pi@rasp ~/script $ cat py.output.txt
>> pi@rasp ~/script $ sudo service rsyslog stop
[ ok ] Stopping enhanced syslogd: rsyslogd.
>> pi@rasp ~/script $ cat py.output.txt
2016-01-24 21:05:32.112457 :2016-01-24T22:04:22+00:00 192.168.0.198
2016-01-24 21:05:32.113029 :2016-01-24T22:04:33+00:00 192.168.0.198

ishome.py

    #!/usr/bin/env python

    import sys
    from datetime import datetime

    # Open the log file in append mode; the third argument (0) requests
    # unbuffered writes (Python 2 semantics), then route print output into it.
    filename = open("/home/pi/script/py.output.txt", 'a', 0)
    sys.stdout = filename

    # Timestamp every line arriving on stdin from rsyslog.
    for line in sys.stdin:
        print(str(datetime.now()) + ' :' + line)

At first I believed that stdin was buffered and that the stream was only processed at closure. But when I look at the time at which each stdin line is processed, I can see that stdin is processed in real time. So is it the write-out that is not happening?

I've also tested this scenario with hundreds of input lines written to MongoDB via pymongo. Again, the DB is only updated when the process is terminated.
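For reference, a minimal sketch of how that MongoDB test might look; the host, database name (logdb) and collection name (events) are assumptions, not taken from the original test:

    #!/usr/bin/env python
    # Hedged sketch of the MongoDB variant; logdb/events are made-up names.
    import sys
    from datetime import datetime

    from pymongo import MongoClient

    client = MongoClient("localhost", 27017)
    events = client.logdb.events

    for line in sys.stdin:
        # Each insert_one is an acknowledged write sent to the server
        # immediately, so any remaining delay would sit on the stdin side.
        events.insert_one({"received": datetime.now(), "raw": line.rstrip("\n")})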

Any ideas what is causing the delay in writing? I want every new event to be written to the output (be it DB or file) in real time.

Indeed, I discovered that flushing the file is the solution. The following code did the trick ... link: flush file example
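A minimal sketch of such a flushing loop, assuming the same file path as the script above; the os.fsync call is an extra durability step added here, not necessarily part of the original fix:

    #!/usr/bin/env python
    # Sketch of the flushing fix: write each event and flush it straight away.
    import os
    import sys
    from datetime import datetime

    logfile = open("/home/pi/script/py.output.txt", 'a')

    for line in sys.stdin:
        logfile.write(str(datetime.now()) + ' :' + line)
        logfile.flush()             # empty Python's userspace buffer
        os.fsync(logfile.fileno())  # optionally ask the kernel to hit the disk

With flush() after every event, each line reaches the file as soon as it is processed, regardless of whether the script runs in the foreground or as a background child of rsyslog.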

