Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - DeltaMikeCharlie

Pages: [1] 2 3 4
General Discussions / Re: Some Improper Character Encoding
« on: March 24, 2021, 06:06:31 PM »
For now, I have a working AWK command that will search for the twice-encoded characters and restore them to their once-encoded values.

awk '{gsub(/\xC3\xA2\xE2\x82\xAC\xC2\x9D/,"\xE2\x80\x9D");gsub(/\xC3\xA2\xE2\x82\xAC\xC5\x93/,"\xE2\x80\x9C");gsub(/\xC3\xA2\xE2\x82\xAC\xCB\x9C/,"\xE2\x80\x98");gsub(/\xC3\xA2\xE2\x82\xAC\xE2\x84\xA2/,"\xE2\x80\x99");}1' input.json > output.json

I will have to insert this command between the EPG fetch from ICE and the EPG import command for my application.  Luckily, the process is already controlled by a script so adding a new line will be trivial.

I'm sure that there will be other characters that I will need to address in the future, but these cover the ones that I have noticed at the moment.

I may even consider converting the fancy Unicode quotes and apostrophes into standard ASCII characters too.
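The per-sequence gsub approach above can also be expressed generically, assuming the corruption really is UTF-8 text that was mis-decoded as Windows-1252 and re-encoded as UTF-8 (which the byte patterns suggest). A hedged Python sketch — `fix_double_encoding` is my own name, not part of any existing tool, and sequences containing bytes undefined in cp1252 (such as the C2 9D case the awk command handles explicitly) will not round-trip and are left unchanged:

```python
# Generic repair for double-encoded UTF-8, assuming the intermediate
# mis-decode used Windows-1252. Not a drop-in replacement for the awk
# command: bytes undefined in cp1252 won't round-trip.
def fix_double_encoding(text: str) -> str:
    try:
        return text.encode("cp1252").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return text  # not double-encoded, or not repairable this way

# b"\xC3\xA2\xE2\x82\xAC\xE2\x84\xA2" is the mangled form observed in the feed.
mangled = b"\xC3\xA2\xE2\x82\xAC\xE2\x84\xA2".decode("utf-8")
print(fix_double_encoding(mangled))  # the intended RIGHT SINGLE QUOTATION MARK
```

The try/except keeps text that was never double-encoded (plain ASCII, or genuine non-Latin characters) untouched rather than corrupting it further.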

General Discussions / Some Improper Character Encoding
« on: March 23, 2021, 03:34:28 PM »
I've found that both my PVR (JSON feed) and your web site show what appears to be invalid characters, for example:

Your web site EPG shows:

Setsuko, a 55-year-old single â€˜office ladyâ€™ in Tokyo

These appear to be Unicode characters that have had their UTF-8 encoding expanded unnecessarily.

â€˜ corresponds to the bytes E2 80 98, which is the UTF-8 encoding of 'LEFT SINGLE QUOTATION MARK' (U+2018)
Likewise, â€™ corresponds to E2 80 99, the UTF-8 encoding of 'RIGHT SINGLE QUOTATION MARK' (U+2019)

There may be more, but these are the ones that I have noticed.  A search for "â€" on the web site should find them.  I have seen them in both the title and description.

The JSON feed is slightly worse:

Setsuko, a 55-year-old single â€˜office ladyâ€™ in Tokyo

â€˜ = C3 A2 E2 82 AC CB 9C

â€™ = C3 A2 E2 82 AC E2 84 A2

It appears to have been encoded as UTF-8 twice.
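For illustration, this double encoding can be reproduced in Python, assuming the intermediate mis-decode used Windows-1252 (which matches the byte sequences quoted above):

```python
# Reproduce the observed corruption: encode the intended character as UTF-8,
# mis-decode those bytes as Windows-1252, then encode the result as UTF-8 again.
intended = "\u2018"                            # LEFT SINGLE QUOTATION MARK
once = intended.encode("utf-8")                # b'\xe2\x80\x98' - correct UTF-8
twice = once.decode("cp1252").encode("utf-8")  # mis-decode, then re-encode
print(twice.hex(" ").upper())                  # C3 A2 E2 82 AC CB 9C
```

The output matches the seven-byte sequences seen in the JSON feed, which supports the twice-encoded hypothesis.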

From what I've read online, this seems to be a common issue caused by software incorrectly interpreting encoding schemes.

Perhaps ICE is receiving these characters pre-corrupted from an upstream source, or perhaps ICE is incorrectly encoding them for web or JSON presentation.

Perhaps ICE TV could search for these character sequences and replace them with their originally intended UTF-8 equivalents before making the data available on their platform.  As these particular byte sequences are nonsensical, the risk of inadvertently changing the intended meaning would be fairly low.

IceTV EPG Content / Frasier actor credit
« on: October 17, 2020, 09:11:26 AM »
In Frasier, it appears that "John Mahoney" is being credited as "Joh Mahoney".

XMLTV (General) / Re: Timer status not updating
« on: July 26, 2018, 11:53:54 AM »
Hi Daniel,

Did you receive the email that I sent on 17-June?

Has there been any progress on this issue?

XMLTV (General) / Timer status not updating
« on: June 16, 2018, 10:47:30 AM »
I'm not sure if I am doing something wrong, or if there is a bug in the API.

When I receive a new timer from ICE with a "waiting" status, I create the timer on the PVR and then send back a status update to "pending".  However, this status is not always reflected on the ICE portal.  Normally, the timer icon on the portal will change from "Queued Single Recording" to "Single Recording" when the "pending" status is returned.

It seemed to happen most often for the last timer in a group of timers.  However, I just did a quick test while preparing this post and found that it seems to always be the last timer that is not correctly updated, regardless of how many were processed.

As a test:  I started with 2 groups of 3 timers.  In the first group, all 3 timers showed as "Single Recording", however, in the second group, only the first 2 showed as "Single Recording" and the third showed as "Queued Single Recording".  The timers were in this state for several polling cycles and in my log file I could see ICE continuously sending "waiting" messages for the "Queued Single Recording" timer, even though I was continuously sending "pending" responses.

I then created another single timer and then the last timer of the second group suddenly became a "Single Recording" and now the new single timer appears to be stuck on "Queued Single Recording".

Has anyone else encountered this issue?

Also, this only seems to happen with timers sent from ICE.  If I create a timer on the PVR and then send it to ICE, the ICE portal immediately shows the timer as "Single Recording" but leaves the previous "Queued Single Recording" still in its "waiting" state.

Perhaps a workaround would be to never accept an ICE timer: always send a deletion request back to ICE for timers created by the portal, then create a manual timer on the PVR for the same event and post that manual timer to ICE as a new timer.

XMLTV (General) / Re: Phantom Timer
« on: April 11, 2018, 09:03:23 AM »
I may have resolved this, but I'm still collecting evidence.

This timer had been previously uploaded to ICE as having been created on the PVR.  My working hypothesis is that the underlying event changed so ICE sent an updated timer to the PVR.

XMLTV (General) / Phantom Timer
« on: April 10, 2018, 11:53:09 AM »
I seem to have received a phantom timer:

<timer id="16386666313566939991" name="NCIS: Los Angeles" device_id="nnnnnn" channel_id="5" show_id="138115637" start_time="2018-04-10T11:40:00+00:00" duration_minutes="55" action="record" state="pending" message="Updated by Scheduler SERIES" series_recording_id="0" keyword_id="0" pre-padding="-1" post-padding="-1" padding-type="" />

I have looked on my web interface and I can see no matching "My Series" entry.  The only entry that I see matches the entry that the API returns.

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE keyword SYSTEM "">
 <keyword id="143114" match_pattern="episode:&quot;testselection&quot; " type="Record" device_id="nnnnnn" device_label="PVR-Name" network_id="-1" channel_id="-1" recording_quality="Prefer HD" matches_per_day="0" airing="First runs and re-runs" marked_for_deletion="0" />

I also looked in '/series/favourites' and also found nothing.

I created a test search containing "NCIS" some time ago, but it was deleted quite a few days ago.

Am I doing something wrong, or is there a problem on ICE's side?

XMLTV (General) / Re: XMLTV vs ICETV Products
« on: April 05, 2018, 05:10:51 PM »
Thanks Daniel.

So does XMLTV provide data according to this schema:


And ICETV according to this one?

<!DOCTYPE shows SYSTEM "">

XMLTV (General) / XMLTV vs ICETV Products
« on: April 05, 2018, 04:17:25 PM »
On the web site, there are 2 subscription types described: "ICETV" and "XMLTV".  It's clear that XMLTV excludes the "full service" features of the more expensive ICETV service.  Using the ICETV service, I have been able to obtain data formatted in both XML and JSON.

Can you please confirm whether the "XMLTV" service is restricted to XML-formatted data, or whether JSON-formatted data is also available?

XMLTV (General) / Rename Device XML vs JSON
« on: March 23, 2018, 12:00:40 PM »
When I make the following XML request, it is successful:

<device uid="aa:bb:cc:dd:ee:ff" label="NewName" type_id="33" />

However, when I make what appears to be an equivalent JSON request, it returns a http status of 200, but the device is not modified:

"devices": [{
    "uid": "aa:bb:cc:dd:ee:ff",
    "label": "NewName",
    "type_id": "33"
}]

I have tried including the "types" object and/or the id property as per the sample documentation, but that makes no difference.

Even though the documentation says to use a POST, that always returns a "405 - Method not allowed" but a PUT returns a 200.  The XML version also needs a PUT, not a POST.

I was converting my app from XML to JSON, but I guess that I will keep this call using XML.
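For anyone hitting the same 405, the working call can be sketched like this in Python. The endpoint URL below is a placeholder, not the documented ICE TV address, and the payload shape simply mirrors the request above:

```python
# Hypothetical sketch of the rename call issued as a PUT (POST returns
# "405 - Method not allowed" as described). The URL is a placeholder.
import json
import urllib.request

payload = {"devices": [{"uid": "aa:bb:cc:dd:ee:ff",
                        "label": "NewName",
                        "type_id": "33"}]}

req = urllib.request.Request(
    "https://api.example.com/devices",          # placeholder endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Accept": "application/json"},
    method="PUT",                               # PUT, not the documented POST
)
print(req.get_method())  # PUT
```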

XMLTV (General) / Re: Timer creation question.
« on: March 12, 2018, 02:20:26 PM »
Thanks Daniel.

XMLTV (General) / Timer creation question.
« on: March 12, 2018, 10:53:23 AM »
I would like to know what ICE's recommended practice is when a timer is scheduled for a channel that has multiple LCNs associated.

For example, in Sydney, "ABC" is available on 2 LCNs: 2 and 21.

<channel id="2" name="ABC" name_short="ABC" network_id="2" region_id="1" is_hd="0" lcns="2,21" region_name="NSW - Sydney" network_name="ABC" is_hidden="0" icon_src="" icon_width="0" icon_height="0">
  <dvb original_network_id="4112" transport_stream_id="545" service_id="545" />
  <dvb original_network_id="4112" transport_stream_id="545" service_id="547" />
</channel>

When a timer is created for "channel id=2", it theoretically should spawn 2 timers, one for each of the LCNs.

<timer id="1234567890" name="Grand Designs" device_id="1234" channel_id="2" show_id="137486682" start_time="2018-03-12T00:00:00+00:00" duration_minutes="60" action="record" state="waiting" message="Created with Interactive web interface." series_recording_id="0" keyword_id="0" pre-padding="-1" post-padding="-1" padding-type="" />

I know that I could: 1) ask the user to designate a preferred LCN; or 2) arbitrarily pick one, I'd just like to know what ICE would expect to happen.

XMLTV (General) / Re: Strange JSON escape characters in EPG data.
« on: February 20, 2018, 06:13:42 AM »
Quote from: prl
I sent a long and detailed email about this and a related problem to Daniel Hall a few weeks ago. I haven't heard back.

Thanks prl.

XMLTV (General) / Re: New XML Protocol & Channel Data Inconsistencies
« on: February 20, 2018, 06:12:29 AM »
As a stab in the dark, I tried to explicitly request EPG data for a channel that was excluded in my preferences using the "&channel_id=xxxx" parameter in my request.  Unfortunately, I received an empty response.

<?xml version="1.0"?>
<shows page="1" rows_per_page="0" total_rows="0" last_update_time="1519067101" />

Would it be possible to add an "ignore_ch_prefs=1" (or similar) parameter to the EPG request?

XMLTV (General) / Strange JSON escape characters in EPG data.
« on: February 19, 2018, 05:12:14 PM »
I have noticed a difference in the "show" description text depending on whether the selected format is JSON or XML.

Here is an XML example.  Note the quotes around the first line of the description.

  <desc lang="en">I wanted to know what their plan was. I was their plan!

The Doctor has been summoned by an old friend, but in the Cabinet War Rooms far below the streets of blitz-torn London, it's his oldest enemy he finds waiting for him...

The Daleks are back - but can Winston Churchill be in league with them? </desc>

However, the JSON version has a series of escape characters that do not appear to correspond to the character shown in the XML version.

    "desc": "\u00E2\u0080\u009CI wanted to know what their plan was. I was their plan!\u00E2\u0080\u009D\r\n\r\nThe Doctor has been summoned by an old friend, but in the Cabinet War Rooms far below the streets of blitz-torn London, it's his oldest enemy he finds waiting for him...\r\n\r\nThe Daleks are back - but can Winston Churchill be in league with them? ",

"\u00E2\u0080\u009C" actually appears to be the UTF-8 byte sequence (E2 80 9C) of the 'LEFT DOUBLE QUOTATION MARK' character (U+201C), escaped byte by byte.

I have encountered a number of web sites suggesting that the correct JSON encoding should actually be "\u201C".

I have also seen this occur with "\u00E2\u0080\u009D" (right double quote "\u201D") and "\u00E2\u0080\u0093" (en dash "\u2013").  Perhaps there are others.

I'm happy to be proven wrong, but I thought that files containing JSON were already supposed to be Unicode, and that only reserved characters (such as quotes and backslashes) and control characters needed to be escaped.

It appears that the process creating the JSON output is reading the source as single-byte text (ASCII/Latin-1) rather than UTF-8, and is escaping each byte of a multi-byte UTF-8 sequence individually instead of treating the sequence as one character.
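That diagnosis can be checked in Python: loading the served escapes yields three separate one-byte code points, and re-interpreting those code points as raw bytes recovers the intended character (Latin-1 here stands in for whatever single-byte interpretation the server applied, since it maps code points 0-255 straight to bytes):

```python
import json

# The served escapes, loaded as JSON: three one-byte code points, not one char.
mangled = json.loads('"\\u00E2\\u0080\\u009C"')
assert [hex(ord(c)) for c in mangled] == ['0xe2', '0x80', '0x9c']

# Re-interpreting those code points as raw bytes recovers the intended
# UTF-8 sequence for LEFT DOUBLE QUOTATION MARK (U+201C).
repaired = mangled.encode("latin-1").decode("utf-8")
assert repaired == "\u201C"

# A correctly escaped document would simply contain \u201C:
assert json.loads('"\\u201C"') == repaired
```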

I am using cURL with [--header "Accept: application/json"].  I was willing to concede that perhaps cURL is mangling the data on the way in, however, the trace file shows the data arriving pre-mangled.

Is there something wrong with my request or is the server genuinely serving up these seemingly erroneously escaped characters?
