From 6dd2c2e01c1c111ecaa1ee615f86b17f4922f900 Mon Sep 17 00:00:00 2001
From: Nikolay Kichukov
Date: Sat, 23 Jun 2018 23:21:27 +0200
Subject: [PATCH 1/3] Fix multiple bugs in the kodi -> tvheadend 4.2 (current
 stable) mapping script map_to_hts.py, clean up the code, and add new
 features: an update_channels.py script, a README.md with detailed setup
 steps, a working epg_fetch_upload.sh script that imports the external EPG,
 and new timeout slider values (minimum 1 second, maximum 3 minutes, 1-second
 steps) to allow working with a slower/overloaded API server.

---
 README.md           |  42 ++-
 epg_fetch_upload.sh |   3 +
 map_to_hts.py       | 671 +++++++++++++++++++++++---------------------
 settings.xml        |  18 ++
 update_channels.py  | 116 ++++++++
 5 files changed, 529 insertions(+), 321 deletions(-)
 create mode 100755 epg_fetch_upload.sh
 create mode 100644 settings.xml
 create mode 100755 update_channels.py

diff --git a/README.md b/README.md
index 2af88e4..f40eb35 100644
--- a/README.md
+++ b/README.md
@@ -1,14 +1,44 @@
 plugin.program.bscfusion
 ======================
-It is not official addon from provider.
-It was made just for fun.
-You were warned.
+This is not an official addon from the provider.
+It was made just for fun.
+You were warned.
 
 Plugin can be installed via repo:
 https://github.com/kodi1/kodi1.github.io/releases/download/v0.0.1/repo.bg.plugins.zip
 
-You need a tvheadend running.
+How it works:
 
-Run ./map_to_hts.py localhost localhost
+1.) Install kodi (tested and working with kodi 17.6, the current stable release).
+2.) Install the tvheadend server (hts) (tested and working with version 4.2.4, the current stable release).
+3.) Configure the tvheadend server so it is ready for the integration:
+   - By default, connecting to the tvheadend web API will not prompt for credentials.
+   - Enable the XMLTV EPG grabber module: Configuration / Channel/EPG / EPG Grabber Modules -> enable External: XMLTV.
+   - Locate the xmltv.sock file that got created; on Gentoo Linux, for example, it is /etc/tvheadend/epggrab/xmltv.sock. Adjust filesystem ACLs if needed so you can write to it.
+   - Obtain the external EPG; on Linux run: ```wget -O epg.xml.gz http://epg.kodibg.org/dl.php```
+   - Upload the EPG into tvheadend via the xmltv socket interface; on Linux run (as root or another user with write access to the socket):
+     ```zcat epg.xml.gz | nc -q 1 -U /etc/tvheadend/epggrab/xmltv.sock``` OR
+     run the ```./epg_fetch_upload.sh``` script.
+     (The prerequisite is netcat/ncat; any other tool that can write to unix sockets works too.)
+
+4.) At this point you are ready to install this plugin in your kodi from the repository provided above. There are instructions on how to do this in the forum, but it boils down to: downloading the .zip file to the kodi device and going to Addons / My Addons / Install from file. Once installed, it will show up under Addons / Services, where you can configure all necessary settings, including your Bulsatcom IPTV credentials.
+5.) To see if the plugin works, manually load http://[kodi hostname/ip]:8888/dumpch, e.g. http://127.0.0.1:8888/dumpch. This should return a JSON list of all the channels available under your subscription. If it works, you are ready to proceed with the tvheadend mapping and integration. If it does not, fix this before proceeding (see the troubleshooting notes at the bottom).
+6.) Run ```./map_to_hts.py [hts hostname/ip] [kodi hostname/ip]```
+   * kodi and tvheadend can live on different devices and IP addresses, but running both on 'localhost' is the most common setup.
+   * This script requires python 2.7 to be installed. It does not have to be the default python interpreter on the system, but it needs to be available. If you have problems, invoke it explicitly with ```python2.7 map_to_hts.py [parameters*]```, or remove the newer 3.x version and try running the script again.
+   * It may take a while for the mux creation and service mapping to happen, so be patient. This version of the script starts 5 muxes at a time, waits 15 seconds, and then spawns another 5. On decent hardware and a decent connection this should be fine.

-Use Tvheadend PVR client addon for Kodi
+7.) Run ```./update_channels.py [hts hostname/ip]```
+   * This assigns tags to the channels, links them to the previously imported EPG, and sets channel icons where a mapping between the external EPG and Bulsatcom exists.
+   * Like the previous script, it requires python 2.7 to be available on the system.
+
+8.) Install the Tvheadend PVR client addon and configure it with the tvheadend IP/FQDN and port.
+9.) Go to the TV section and enjoy watching your subscription on this amazing setup!
+
+You can also watch on your iPhone/iPad using TvhClient; this requires both kodi and tvheadend to be up and running.
+
+If something goes wrong, first consult the kodi log under ~/.kodi/temp/. It can show you where an obvious problem is.
+Second, check the forums; others may already have run into the same issue and figured out a way to resolve it. If you cannot find your problem, ask in the forum.
+Last, create a bug report on github.
+
+Happy watching!
diff --git a/epg_fetch_upload.sh b/epg_fetch_upload.sh
new file mode 100755
index 0000000..48c590e
--- /dev/null
+++ b/epg_fetch_upload.sh
@@ -0,0 +1,3 @@
+#!/bin/bash
+wget -O epg.xml.gz http://epg.kodibg.org/dl.php
+cat epg.xml.gz | nc -q 1 -U /etc/tvheadend/epggrab/xmltv.sock
diff --git a/map_to_hts.py b/map_to_hts.py
index d60518b..367c679 100755
--- a/map_to_hts.py
+++ b/map_to_hts.py
@@ -1,24 +1,37 @@
-#!/usr/bin/python
+#!/usr/bin/python2.7
 # -*- coding: utf8 -*-
+# What this does:
+# Creates a tvheadend network and muxes, which auto-map into services, and
+# then maps the services into channels and assigns provider tags,
+# all driven by the bsc kodi plugin API.
+# Prints the steps for getting EPG data and icons added to the channels.
+# Use at your own risk!
import os, sys from time import sleep try: - import requests + import requests except: - p = os.path.join(os.getcwd(), '..', 'script.module.requests', 'lib') - sys.path.insert(0, p) - import requests - pass + p = os.path.join(os.getcwd(), '..', 'script.module.requests', 'lib') + sys.path.insert(0, p) + import requests + pass try: - import simplejson as json + import simplejson as json except: - p = os.path.join(os.getcwd(), '..', 'script.module.simplejson', 'lib') - sys.path.insert(0, p) - import simplejson as json - pass + p = os.path.join(os.getcwd(), '..', 'script.module.simplejson', 'lib') + sys.path.insert(0, p) + import simplejson as json + pass + +def which(ff): + for p in os.environ["PATH"].split(os.pathsep): + pp = os.path.join(p, ff) + if os.path.exists(pp): + return pp + return None __delay = 1 create_mux = 'api/mpegts/network/mux_create' @@ -29,332 +42,360 @@ get_service = 'api/mpegts/service/grid' get_tags = 'api/channeltag/grid' map_all = 'api/service/mapper/start' +map_channels = 'api/service/mapper/save' +channel_list = 'api/channel/list' +ffmpegstr = which('ffmpeg')+ ' -user-agent "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36" -loglevel fatal -i http://KODI/id/CHANNEL -vcodec copy -acodec copy -metadata service_provider=bsc -metadata service_name=CHANNEL -f mpegts pipe:1' _headers = { - 'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:48.0) Gecko/20100101 Firefox/48.0', - 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', - 'Accept-Language': 'en-US,en;q=0.5', - 'Accept-Encoding': 'gzip, deflate', - 'X-Requested-With': 'XMLHttpRequest', - 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', - 'Connection': 'keep-alive' + 'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:48.0) Gecko/20100101 Firefox/48.0', + 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', + 'Accept-Language': 'en-US,en;q=0.5', + 'Accept-Encoding': 'gzip, deflate', + 'X-Requested-With': 'XMLHttpRequest', + 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', + 'Connection': 'keep-alive' } net_create_data = { - 'class': 'iptv_network', - 'conf': '' + 'class': 'iptv_network', + 'conf': '' } load_data = { - 'class': 'mpegts_network', - 'enum': '1', - 'query': '', + 'class': 'mpegts_network', + 'enum': '1', + 'query': '', } grid_list = { - 'sort': 'name', - 'dir': 'ASC', - 'start': 0, - 'limit': 999999999 + 'sort': 'name', + 'dir': 'ASC', + 'start': 0, + 'limit': 999999999 } -def which(ff): - for p in os.environ["PATH"].split(os.pathsep): - pp = os.path.join(p, ff) - if os.path.exists(pp): - return pp - return None - - def have_net(e, net): - for x in e.get('entries', []): - key = x.get('key', None) - if key is not None and net == x.get('val', None): - return key + for x in e.get('entries', []): + key = x.get('key', None) + if key is not None and net == x.get('val', None): + return key - return None + return None def disable_auto_check_service(hts_conn): - r = hts_conn.post('%s/%s' % (url, get_service), headers=_headers, data=grid_list) - if r.status_code == requests.codes.ok: - for x in r.json().get('entries', []): - _d = { - 'enabled': True, - 'auto': 1, - 'channel': [], - 'priority': 0, - 'dvb_ignore_eit': False, - 'charset': 'AUTO', - 'prefcapid': 0, - 'prefcapid_lock': 0, - 'force_caid': '0x0', - 'uuid': x['uuid'] - } - - r = hts_conn.post('%s/%s' % (url, save_node), headers=_headers, data={'node': json.dumps(_d)}) - if r.status_code != 
requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error auto disabele') - else: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list auto disabele') + r = hts_conn.post('%s/%s' % (url, get_service), headers=_headers, data=grid_list) + if r.status_code == requests.codes.ok: + for x in r.json().get('entries', []): + #print ("in disable_auto_check_service x is: " + json.dumps(x, indent=4 * ' ')) + _d = { + 'enabled': True, + 'auto': 1, + #'channel': [], + 'priority': 0, + 'dvb_ignore_eit': False, + 'charset': 'AUTO', + 'prefcapid': 0, + 'prefcapid_lock': 0, + 'force_caid': '0x0', + 'uuid': x['uuid'] + } + + r = hts_conn.post('%s/%s' % (url, save_node), headers=_headers, data={'node': json.dumps(_d)}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error auto disabele') + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list auto disabele') def add_mux(hts_conn, ch): - _n = { - 'enabled': True, - 'epg': 1, - 'scan_state': 0, - 'pmt_06_ac3': 0, - 'iptv_url': ch['mux_url'], - 'iptv_interface':'', - 'iptv_atsc': False, - 'iptv_muxname' : ch['mux_name'], - 'iptv_sname': ch['mux_name'], - 'charset': 'AUTO', - 'priority': 0, - 'spriority': 0, - 'iptv_respawn': True, - 'iptv_env': '' - } - _mux = { - 'uuid': uuid, - 'conf': json.dumps(_n) - } - r = hts_conn.post('%s/%s' % (url, create_mux), headers=_headers, data=_mux) - if r.status_code != requests.codes.ok: - print '%s\nError:\n%s' % (ch['mux_name'], r.content) - sys.exit('Error list chanels') - #else: - #print 'Add: %s' % ch['mux_name'] + _kodi_params = ch['mux_url'].split() + _ffmpegstr = ffmpegstr.replace("KODI", _kodi_params[1]) + _ffmpegstr = _ffmpegstr.replace("CHANNEL", _kodi_params[2]) + #print (_ffmpegstr) + _n = { + 'enabled': True, + 'epg': 1, + 'scan_state': 0, + 'pmt_06_ac3': 0, + #'iptv_url': ch['mux_url'], + #let's use dynamic pipe, ffmpeg, so that we are independent on + #the location of the dumper template in the source (kodi) + #makes sense when kodi and HTS are on different machines + 'iptv_url': 'pipe://'+_ffmpegstr, + 'iptv_interface':'', + 'iptv_atsc': False, + 'iptv_muxname' : ch['mux_name'], + 'iptv_sname': ch['mux_name'], + 'charset': 'AUTO', + 'priority': 0, + 'spriority': 0, + 'iptv_respawn': True, + 'iptv_env': '', + 'channel_number': ch['ch_idx'] #dobaviame nomer na kanal v mux + } + _mux = { + 'uuid': uuid, + 'conf': json.dumps(_n) + } + r = hts_conn.post('%s/%s' % (url, create_mux), headers=_headers, data=_mux) + if r.status_code != requests.codes.ok: + print '%s\nError:\n%s' % (ch['mux_name'], r.content) + sys.exit('Error list channels') + #else: + #print 'Add: %s' % ch['mux_name'] def wait_mux(hts_conn): - sleep(__delay) - r = hts_conn.post('%s/%s' % (url, get_service), headers=_headers, data=grid_list) - if r.status_code == requests.codes.ok: - return len(r.json().get('entries', [])) - else: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') + sleep(__delay) + r = hts_conn.post('%s/%s' % (url, get_service), headers=_headers, data=grid_list) + if r.status_code == requests.codes.ok: + return len(r.json().get('entries', [])) + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') def add_tag(hts_conn, tag): - r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - - for k in 
r.json().get('entries', []): - if tag == k.get('val'): - return k.get('key') - - _n = { - 'enabled': True, - 'index': 0, - 'name': tag, - 'internal': False, - 'private': False, - 'icon': '', - 'titled_icon': False, - 'comment': tag - } - r = hts_conn.post('%s/api/channeltag/create' % url, headers=_headers, data={'conf' : json.dumps(_n)}) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - - r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list url') - - for k in r.json().get('entries', []): - if tag == k.get('val'): - #print 'Tag created %s:\n%s' % (k.get('val'), k.get('key')) - return k.get('key') - - sys.exit('Error tag') - -def epg_chanels_edit(hts_conn, bch): - r = hts_conn.get('http://epg.kodibg.org/dlmap.php') - if r.status_code != requests.codes.ok: - sys.exit('Error map') - - map_epg = r.json() - - r = hts_conn.post('%s/api/epggrab/channel/list' % url, headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list epg/xml') - - epg_list = r.json().get('entries', []) - - r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list tag') - - tag_list = r.json().get('entries', []) - - r = hts_conn.post('%s/api/channel/list' % url, headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list epg/chanels') - - all_ch = r.json().get('entries', []) - n = len(all_ch) - for ch in all_ch: - for x in bch: - if ch.get('val') == x.get('mux_name'): - r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data={'uuid': ch['key'],'meta': 1}) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list epg/chanels') - - for t in tag_list: - if t['val'] == x['tag']: - break - - ee = [] - for e in epg_list: - if e['key'].split('|')[1] == map_epg.get(x['mux_name'], {'id': x['mux_name']})['id']: - ee = [e['key']] - #if e['text'].split(':')[0] == map_epg.get(x['mux_name'], {'id': x['mux_name']})['id']: - #ee = [e['uuid']] - break - - update = {'uuid': r.json()['entries'][0]['uuid']} - - p = r.json()['entries'][0]['params'] - for i in [0, 1, 5, 6, 7, 8, 9, 10]: - #for i in [0, 1, 5, 6, 7, 8, 9, 10, 12]: - #print i, x.get('title') - update[p[i]['id']] = p[i]['value'] - - update['name'] = x['title'] - update['epggrab'] = ee - update['tags'].append(t['key']) - update['number'] = x['ch_idx'] - #print json.dumps(update, indent=4 * ' ') - - r = hts_conn.post('%s/%s' % (url, save_node), headers=_headers, data={"node": '%s' % json.dumps(update)}) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - n -= 1 - - e = None - if len(ee): - e = ee[0].split('|')[1] - - print ("Name: %s Epg: %s Cat: %s - %d" % (x.get('title'), e, t['val'], n)).encode('utf8') - sleep(0.5) - break + r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + + for k in r.json().get('entries', []): + if tag == k.get('val'): + return k.get('key') + + _n = { + 'enabled': True, + 'index': 0, + 'name': 
tag.encode('utf8'), + 'internal': False, + 'private': False, + 'icon': '', + 'titled_icon': False, + 'comment': tag + } + r = hts_conn.post('%s/api/channeltag/create' % url, headers=_headers, data={'conf' : json.dumps(_n)}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + + r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list url') + + for k in r.json().get('entries', []): + if tag == k.get('val'): + #print 'Tag created %s:\n%s' % (k.get('val'), k.get('key')) + return k.get('key') + + sys.exit('Error tag') + +def update_tag_in_channels(hts_conn, bch): + r = hts_conn.post('%s/api/channeltag/list' % url, headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list tag') + tag_list = r.json().get('entries', []) + + r = hts_conn.post('%s/api/channel/list' % url, headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list epg/channels') + all_ch = r.json().get('entries', []) + + for ch in all_ch: + for x in bch: + if ch.get('val') == x.get('mux_name'): + r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data={'uuid': ch['key'],'meta': 1}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list epg/channels') + + for t in tag_list: + if t['val'] == x['tag']: + break + + update = {'uuid': r.json()['entries'][0]['uuid']} + p = r.json()['entries'][0]['params'] + #name, number and tag correspond to those offsets in tvheadend 4.3+ + #for i in [2, 3, 13]: + #name, number and tag correspond to those offsets in tvheadend 4.1 and 4.2 + for i in [2, 3, 12]: + #print "p i 'id' : " + p[i]['id'] + #print "p i 'value' : " + p[i]['value'] + update[p[i]['id']] = p[i]['value'] + + #update['name'] = x['title'] + update['tags'].append(t['key']) + #update['number'] = x['ch_idx'] + + r = hts_conn.post('%s/%s' % (url, save_node), headers=_headers, data={"node": '%s' % json.dumps(update)}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + +def map_services_to_channels(hts_conn): + services_uids = [] + match = False + r = hts_conn.post('%s/%s' % (url, channel_list), headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + all_ch = r.json().get('entries', []) + + r = hts_conn.post('%s/%s' % (url, get_service), headers=_headers, data=grid_list) + if r.status_code == requests.codes.ok: + services_list = r.json().get('entries', []) + for _service in services_list: + if len(_service['channel']) > 0: + for _ch in all_ch: + if _ch['key'] == _service['channel'][0]: + match = True + break + if match is False: + services_uids.append(_service['uuid']) + match = False + + print json.dumps(services_uids) + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list services') + + _data_node = { + "services": services_uids, + "encrypted": True, + "merge_same_name": False, + "check_availability": False, + "type_tags": False, + "provider_tags": False, + "network_tags": False + } + r = hts_conn.post('%s/%s' % (url, map_channels), headers=_headers, data={"node": '%s' % json.dumps(_data_node)}) + if r.status_code != requests.codes.ok: 
+ print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error mapping channels') if __name__ == '__main__': - in_list = [] - if len(sys.argv) != 3: - sys.exit('\nWrong parameters\nusage: %s [hts hostname/ip] [kodi hostname/ip]\n' % (sys.argv[0])) - - if which('ffmpeg') is None: - sys.exit('\nffmpeg not found') - - hts_conn = requests.Session() - - r = hts_conn.get('http://%s:8888/dumpch' % sys.argv[2]) - if r.status_code != requests.codes.ok: - sys.exit('Error Connection') - - lst_all = r.json().get('list', []) - for ch in lst_all: - if ch not in in_list: - in_list.append(ch) - else: - print 'Skip duplicate entry: %s' % str(ch) - - net_name = r.json().get('service', 'Unamed') - _headers['Host'] = '%s:9981' % sys.argv[1] - url = 'http://%s' % _headers['Host'] - _headers['Referer'] = '%s/extjs.html?' % url - - #epg_chanels_edit(hts_conn, in_list) - #disable_auto_check_service(hts_conn) - #sys.exit() - - r = hts_conn.get('%s/extjs.html?' % url, headers=_headers) - if r.status_code == requests.codes.ok: - r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data=load_data) - if r.status_code == requests.codes.ok: - uuid = have_net(r.json(), net_name) - if uuid is None: - print 'Create network %s' % net_name - _c = { - 'networkname': net_name, - 'autodiscovery': False, - 'skipinitscan': True, - 'id_chnum': False, - 'ignore_chnum': False, - 'max_streams': 0, - 'max_bandwidth': 0, - 'max_timeout': 15, - 'nid': 0, - 'idlescan': False, - 'charset': 'AUTO', - 'localtime': False, - 'priority': 1, - 'spriority': 1 - } - - _d = {'class': 'iptv_network', 'conf': json.dumps(_c)} - - r = hts_conn.post('%s/%s' % (url, create_net), headers=_headers, data=_d) - if r.status_code == requests.codes.ok: - r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data=load_data) - if r.status_code == requests.codes.ok: - uuid = have_net(r.json(), net_name) - print 'Net created %s uuid %s' % (net_name, uuid) - else: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - else: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - - print 'Net %s uuid %s' % (net_name, uuid) - r = hts_conn.post('%s/%s' % (url, load_mux), headers=_headers, data=grid_list) - if r.status_code == requests.codes.ok: - for x in r.json().get('entries', []): - for y in in_list: - if x['name'] == y['mux_name']: - print 'Skip: %s at %s' % (y['mux_name'], y['mux_url']) - in_list.remove(y) - else: - sys.exit('Error list chanels') - - num = len(in_list) - #for ch in in_list[30:35]: - for ch in in_list: - active = wait_mux(hts_conn) - add_mux(hts_conn, ch) - for t in range(0, 30 + __delay, __delay): - _a = wait_mux(hts_conn) - if _a != active: - active = _a - break - num -= 1 - print ('Active %s - %d left' % (ch['mux_name'], num)).encode('utf8') - add_tag(hts_conn, ch.get('tag', 'empty')) - - disable_auto_check_service(hts_conn) - - r = hts_conn.post('%s/%s' % (url, map_all), headers=_headers) - if r.status_code != requests.codes.ok: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error map all') - - sleep(2) - epg_chanels_edit(hts_conn, lst_all) - else: - print 'Error: %s -> %d' % (r.url, r.status_code) - sys.exit('Error list chanels') - else: - print 'Login Error:\n%s' % r.content - sys.exit('Error list chanels') + in_list = [] + if len(sys.argv) != 3: + sys.exit('\nWrong parameters\nusage: %s [hts hostname/ip] [kodi hostname/ip]\n' % (sys.argv[0])) + + if which('ffmpeg') is None: + sys.exit('\nffmpeg not found') + + hts_conn = 
requests.Session() + + r = hts_conn.get('http://%s:8888/dumpch' % sys.argv[2]) + if r.status_code != requests.codes.ok: + sys.exit('Error Connection') + + lst_all = r.json().get('list', []) + for ch in lst_all: + if ch not in in_list: + in_list.append(ch) + else: + print 'Skip duplicate entry: %s' % str(ch) + + net_name = r.json().get('service', 'Unamed') + _headers['Host'] = '%s:9981' % sys.argv[1] + url = 'http://%s' % _headers['Host'] + _headers['Referer'] = '%s/extjs.html?' % url + + r = hts_conn.get('%s/extjs.html?' % url, headers=_headers) + if r.status_code == requests.codes.ok: + r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data=load_data) + if r.status_code == requests.codes.ok: + uuid = have_net(r.json(), net_name) + if uuid is None: + print 'Create network %s' % net_name + _c = { + 'networkname': net_name, + 'autodiscovery': False, + 'skipinitscan': True, + 'id_chnum': False, + 'ignore_chnum': False, + 'max_streams': 0, + 'max_bandwidth': 0, + 'max_timeout': 15, + 'nid': 0, + 'idlescan': False, + 'charset': 'AUTO', + 'localtime': False, + 'priority': 1, + 'spriority': 1 + } + + _d = {'class': 'iptv_network', 'conf': json.dumps(_c)} + + r = hts_conn.post('%s/%s' % (url, create_net), headers=_headers, data=_d) + if r.status_code == requests.codes.ok: + r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data=load_data) + if r.status_code == requests.codes.ok: + uuid = have_net(r.json(), net_name) + print 'Net created %s uuid %s' % (net_name, uuid) + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + + print 'Net %s uuid %s' % (net_name, uuid) + r = hts_conn.post('%s/%s' % (url, load_mux), headers=_headers, data=grid_list) + if r.status_code == requests.codes.ok: + for x in r.json().get('entries', []): + for y in in_list: + if x['name'] == y['mux_name']: + print 'Skip: %s at %s' % (y['mux_name'], y['mux_url']) + in_list.remove(y) #remove it if mux already exists + else: + sys.exit('Error list channels') + + num = len(in_list) + #for ch in in_list[30:35]: + for ch in in_list: + if ((num % 5) == 0): + #wait 15 seconds for every 5 muxes, if you are + #on a slower machine or Internet, increase delay or + #lower the modulus + sleep (15) + _a_count = 0 + active = wait_mux(hts_conn) + print ("active is in ch_list: %d" % active) + add_mux(hts_conn, ch) +# Slows things a bit, commented for now. +# for t in range(0, 30 + __delay, __delay): +# _a_count += 1 +# _a = wait_mux(hts_conn) +# print ("_a is for loop: " + _a + " _a_count: " + _a_count) +# +# if _a != active: +# active = _a +# break + num -= 1 + print ('Active %s - %d left' % (ch['mux_name'], num)).encode('utf8') + #print ('Channel details: %s' % json.dumps(ch, indent=4 * ' ')).encode('utf8') + add_tag(hts_conn, ch.get('tag', 'empty')) + + raw_input("Please, wait and verify all HTS services were initialized, before hitting any button to proceed.") + disable_auto_check_service(hts_conn) + map_services_to_channels(hts_conn) +# +# Leaving below for HTS 4.0 compatibility... 
not working in HTS4.1+ +# r = hts_conn.post('%s/%s' % (url, map_all), headers=_headers) +# if r.status_code != requests.codes.ok: +# print 'Error: %s -> %d' % (r.url, r.status_code) +# sys.exit('Error map all') + + sleep(5) + update_tag_in_channels(hts_conn, lst_all) + print("Mapping is done, import EPG from external source, such as: ") + print("http://epg.kodibg.org/dl.php and then run the next script: ") + print("update_channels.py - to link to the EPG and update channel icons!") + else: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + else: + print 'Login Error:\n%s' % r.content + sys.exit('Error list channels') diff --git a/settings.xml b/settings.xml new file mode 100644 index 0000000..fbb6f52 --- /dev/null +++ b/settings.xml @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + diff --git a/update_channels.py b/update_channels.py new file mode 100755 index 0000000..73b5ee6 --- /dev/null +++ b/update_channels.py @@ -0,0 +1,116 @@ +#!/usr/bin/env python2.7 +# -*- coding: utf-8 -*- + +import os, sys +from time import sleep + +try: + import requests +except: + p = os.path.join(os.getcwd(), '..', 'script.module.requests', 'lib') + sys.path.insert(0, p) + import requests + pass + +try: + import simplejson as json +except: + p = os.path.join(os.getcwd(), '..', 'script.module.simplejson', 'lib') + sys.path.insert(0, p) + import simplejson as json + pass + +load_node = 'api/idnode/load' +save_node = 'api/idnode/save' +epggrab_ch_list = 'api/epggrab/channel/list' +channel_list = 'api/channel/list' + +_headers = { + 'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:48.0) Gecko/20100101 Firefox/48.0', + 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', + 'Accept-Language': 'en-US,en;q=0.5', + 'Accept-Encoding': 'gzip, deflate', + 'X-Requested-With': 'XMLHttpRequest', + 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', + 'Connection': 'keep-alive' +} + +def update_channels_epg_icon(hts_conn): + url = 'http://'+sys.argv[1]+':9981' + ee = [] + epg_icon = '' + r = hts_conn.get('http://epg.kodibg.org/dlmap.php') + if r.status_code != requests.codes.ok: + sys.exit('Error downloading EPG->BSC map file.') + + map_epg = r.json() +# Loads the map file from local json file +# with open("/tmp/epg_map.json", "r") as epg_json: +# map_epg = json.load(epg_json) +# print json.dumps(map_epg) + +# need to add EPG first, as this will not work properly if EPG is empty!!! 
+ r = hts_conn.post('%s/%s' % (url, epggrab_ch_list), headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list epggrab channel list.') + + epg_list = r.json().get('entries', []) + #print json.dumps(epg_list[0]) + r = hts_conn.post('%s/%s' % (url, channel_list), headers=_headers) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error list channels') + + all_ch = r.json().get('entries', []) + for ch in all_ch: + r = hts_conn.post('%s/%s' % (url, load_node), headers=_headers, data={'uuid': ch['key'],'meta': 1}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error loading channel') + + channel_name = r.json()['entries'][0]['text'] + #print ("current channel name: " + channel_name), + #find channel name in the map and obtain the EPG name for it + for epg in map_epg: + #if channel_name == epg: + if channel_name.lower() == epg.lower(): + #it exists in the mapping file and matches current channel... get uuid from the existing EPG + #print ("map_epg.get(ch['text'] is: " + map_epg.get(r.json()['entries'][0]['text'])['id']) + #matched_name = map_epg.get(r.json()['entries'][0]['text'])['id'] + for e in epg_list: + current_epg_name = e['text'].split(':')[0] + #print ("current: " + current_epg_name) + #print ("e uuid : " + e['uuid']) + #if current_epg_name == map_epg[epg]['id']: + if current_epg_name.lower() == map_epg[epg]['id'].lower() or current_epg_name.lower() == epg.lower(): + ee = [e['uuid']] + epg_icon = 'http://logos.kodibg.org/'+current_epg_name.lower()+'.png' + print ("EPG and icon for "+channel_name+" will be updated") + break + break + + update = {'uuid': r.json()['entries'][0]['uuid']} + update['epggrab'] = ee + update['icon'] = epg_icon + #print json.dumps(update, indent=4 * ' ') + + r = hts_conn.post('%s/%s' % (url, save_node), headers=_headers, data={"node": '%s' % json.dumps(update)}) + if r.status_code != requests.codes.ok: + print 'Error: %s -> %d' % (r.url, r.status_code) + sys.exit('Error saving channel') + + ee = None + epg_icon = None + +def main(args): + return 0 + +if __name__ == '__main__': + if len(sys.argv) != 2: + sys.exit('\nWrong parameters\nusage: %s [hts hostname/ip]\n' % (sys.argv[0])) + print ("Did you add your EPG first? If not, do so first! 
Otherwise this will not work.") + raw_input ("Visit http://epg.kodibg.org/ to understand more.") + hts_conn = requests.Session() + update_channels_epg_icon(hts_conn) + sys.exit(main(sys.argv)) From 6939d973b3cf10da1bf6652d98488e9cf0cba850 Mon Sep 17 00:00:00 2001 From: Nikolay Kichukov Date: Sun, 24 Jun 2018 07:59:05 +0200 Subject: [PATCH 2/3] gz file needs zcat, fixed --- epg_fetch_upload.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/epg_fetch_upload.sh b/epg_fetch_upload.sh index 48c590e..74e91e0 100755 --- a/epg_fetch_upload.sh +++ b/epg_fetch_upload.sh @@ -1,3 +1,3 @@ #!/bin/bash wget -O epg.xml.gz http://epg.kodibg.org/dl.php -cat epg.xml.gz | nc -q 1 -U /etc/tvheadend/epggrab/xmltv.sock +zcat epg.xml.gz | nc -q 1 -U /etc/tvheadend/epggrab/xmltv.sock From 55ad272c52b48e85e43b7314c416bdabb5371ad2 Mon Sep 17 00:00:00 2001 From: Nikolay Kichukov Date: Mon, 25 Jun 2018 18:57:54 +0200 Subject: [PATCH 3/3] Wrong settings.xml location and missing changes, fixed --- resources/settings.xml | 2 +- settings.xml | 18 ------------------ 2 files changed, 1 insertion(+), 19 deletions(-) delete mode 100644 settings.xml diff --git a/resources/settings.xml b/resources/settings.xml index fbb6f52..e0dd81d 100644 --- a/resources/settings.xml +++ b/resources/settings.xml @@ -5,7 +5,7 @@ - + diff --git a/settings.xml b/settings.xml deleted file mode 100644 index fbb6f52..0000000 --- a/settings.xml +++ /dev/null @@ -1,18 +0,0 @@ - - - - - - - - - - - - - - - - - -