Using an HTTP proxy server to stream IPTV sources

I don't know if this is the right place to ask this question, but I'm hoping to find some direction here.
I have a smart TV and I like to watch TV from my country with the SSIPTV app. I found an Android app that streams local channels, so I inspected its requests with Android Studio to find the streaming links. Some of them are freely accessible, but others are served through CloudFront. The problem is that I can't add the header CloudFront needs to authorize the request.
For example: when I try to make a request without the "User-Agent" header, the response is this:
Status Code: 403 Forbidden
Connection: keep-alive
Content-Length: 560
Content-Type: text/html
Date: Tue, 01 Jan 2019 20:57:50 GMT
Server: CloudFront
Via: 1.1 f7e7b00c5c66a4e43041ba24c378d07a.cloudfront.net (CloudFront)
X-Amz-Cf-Id: uZQAVTrQzHsQe2vGyHxY1OYfjHCL-Nz7gCTG-koHcgr1A5HG7fGGOg==
X-Cache: Error from cloudfront
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<HTML>
<HEAD>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
<TITLE>ERROR: The request could not be satisfied</TITLE>
</HEAD>
<BODY>
<H1>403 ERROR</H1>
<H2>The request could not be satisfied.</H2>
<HR noshade size="1px">
Request blocked.
<BR clear="all">
<HR noshade size="1px">
<PRE>
Generated by cloudfront (CloudFront)
Request ID: TZztsUjltHpEhx54wplzupvLmZwjCRPtAvTcbdJ8DL16b1k9-_XwZw==
</PRE>
<ADDRESS></ADDRESS>
</BODY>
</HTML>
But if I set the "User-Agent" header to the value "iPhone", this is the response:
Status Code: 200 OK
Accept-Ranges: bytes
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: Content-Type, User-Agent, If-Modified-Since, Cache-Control, Range
Access-Control-Allow-Methods: OPTIONS, GET, POST, HEAD
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: Date, Server, Content-Type, Content-Length
Cache-Control: max-age=1
Connection: keep-alive
Content-Length: 366
Content-Type: application/vnd.apple.mpegurl
Date: Tue, 01 Jan 2019 20:51:32 GMT
Server: WowzaStreamingEngine/4.7.6
Via: 1.1 880eb84cefca849ee159a7c4d89c31ea.cloudfront.net (CloudFront)
X-Amz-Cf-Id: pogc8_OBsN2-QeGj_1q8K_vyxrQH-G8a2JmWqSkVt9x57NlbKfDSdQ==
X-Cache: Hit from cloudfront
So, is there a way I could set up a proxy that adds the header, and then get the content served in my TV app?

If you can configure your TV app to use an HTTP proxy, then this is straightforward; e.g. in Squid this is the request_header_add directive, documented here: http://www.squid-cache.org/Doc/config/request_header_add/
request_header_add User-Agent "iPhone"
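For context, a minimal squid.conf sketch (hedged: request_header_add requires Squid 3.5 or later, and the listening port and LAN range below are placeholders chosen for illustration, not values from the question):

```
# Minimal squid.conf sketch -- request_header_add requires Squid 3.5+.
# The listening port and LAN range are placeholders; adjust to your network.
http_port 3128
acl localnet src 192.168.0.0/16
http_access allow localnet
http_access deny all
# Append the User-Agent header to every request Squid forwards.
request_header_add User-Agent "iPhone" all
```

If the TV app has an HTTP proxy setting, point it at this machine on port 3128 and Squid should add the header on the way to CloudFront.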

Related

HTTP API Gateway error "too many redirects", and ALB health check moving from 200 to 302 after deploying code

I have a backend server in Elastic Beanstalk, with RDS in a private subnet. To reach the backend I created an HTTP API Gateway, but when I open the API Gateway endpoint I get a "too many redirects" error. I have been facing this problem for the last two weeks and have applied many solutions with no progress.
One more thing I observed: when I deploy code on EB, the load balancer health check moves from 200 to 302 and I get the error "health check failed with these codes: [302]". If I set 302 as the accepted health check status code, the instance becomes healthy, but the API Gateway error remains.
Here is the output of curl localhost. I also tried redirecting HTTP to HTTPS from the ALB, and also tried without HTTPS on port 443:
Connected to localhost (127.0.0.1) port 80 (#0)
> GET / HTTP/1.1
> Host: localhost
> User-Agent: curl/7.79.1
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 302 Found
< Server: nginx/1.22.0
< Content-Type: text/html; charset=UTF-8
< Transfer-Encoding: chunked
< Connection: keep-alive
< Cache-Control: no-cache, private
< Date: Thu, 12 Jan 2023 11:48:54 GMT
< Location: https://localhost
<
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8" />
<meta http-equiv="refresh" content="0;url='https://localhost'" />
<title>Redirecting to https://localhost</title>
</head>
<body>
Redirecting to https://localhost.
</body>
* Connection #0 to host localhost left intact

Do Google Chrome and similar browsers support Range headers for standard downloads?

My initial response headers - notice the Accept-Ranges header
HTTP/1.1 200 OK
X-Powered-By: Express
Vary: Origin
Access-Control-Allow-Credentials: true
X-RateLimit-Limit: 1
X-RateLimit-Remaining: 0
Date: Thu, 08 Apr 2021 06:14:19 GMT
X-RateLimit-Reset: 1617862461
Accept-Ranges: bytes
Content-Length: 100000000
Content-Type: text/plain; charset=utf-8
Content-Disposition: attachment; filename="some_file.txt"
Connection: keep-alive
Keep-Alive: timeout=5
I then restart the server and click "Resume" on the download in Chrome, but Chrome doesn't send back a Range request header.
I'm following the documentation on Mozilla's website
Am I missing a header, or misunderstanding how this works, especially with Chrome and other browsers? Is there another way I can manually support resuming downloads by sending the right response and understanding the right request? From a technical perspective, if Chrome sends back which range it now needs, I will be able to resume the download.
According to this article, chrome should support something like this. I just need to be pointed in the right direction.
Ty!
Chrome needs some way to know that the file it's trying to download at that URL is indeed the same file when it tries to resume.
If you add support for an ETag header, this will likely work.
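As a rough sketch of the pieces involved (the helper names below are mine, not from Express or any library; treat this as an illustration of pairing an ETag validator with Range parsing, not a drop-in implementation):

```javascript
// Build a weak ETag from file size and modification time, so the
// browser can tell whether the file changed between the interrupted
// download and the resume attempt.
function makeETag(size, mtimeMs) {
  return `W/"${size.toString(16)}-${Math.floor(mtimeMs).toString(16)}"`;
}

// Parse a single-range "Range: bytes=START-END" header into inclusive
// byte offsets; returns null for malformed or unsatisfiable ranges.
function parseRange(header, size) {
  const m = /^bytes=(\d+)-(\d*)$/.exec(header || '');
  if (!m) return null;
  const start = Number(m[1]);
  const end = m[2] ? Number(m[2]) : size - 1;
  if (start > end || end >= size) return null;
  return { start, end };
}
```

The idea: the initial 200 response carries Accept-Ranges: bytes plus the ETag. On resume, the browser can send If-Range with that ETag alongside Range; if the ETag still matches, answer 206 Partial Content with a Content-Range header and only the requested bytes, otherwise fall back to a full 200.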

IE11 losing my cookies: 302 redirect from HTTP to HTTPS

Using IE11 I am making a get request to SITE A:
GET http://www.test.com/?documentId=ef746317-7711-4458-8873-a73700fc1b85 HTTP/1.1
Accept: image/jpeg, application/x-ms-application, image/gif, application/xaml+xml, image/pjpeg, application/x-ms-xbap, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, */*
Accept-Language: en-US
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.2; WOW64; Trident/7.0; .NET4.0E; .NET4.0C; .NET CLR 3.5.30729; .NET CLR 2.0.50727; .NET CLR 3.0.30729)
Connection: Keep-Alive
Host: www.test.com
I receive a redirect with 2 cookies:
HTTP/1.1 302 Found
Date: Wed, 15 Mar 2017 23:48:00 GMT
Content-Type: text/html; charset=UTF-8
Location: https://www.newSite.com/test/Edit/ef746317-7711-4458-8873-a73700fc1b85
Set-Cookie: Auth=EAAAAIQfMoK32BNjBypXapcJppWc==; path=/; secure
Set-Cookie: Auth=EAAAAN+xPT6eioV8LESTR6CViGIvc834gP==; path=/; secure
Cache-Control: private, s-maxage=0
Server: Microsoft-IIS/10.0
X-AspNet-Version: 4.0.30319
X-AspNetMvc-Version: 4.0
X-Powered-By: ASP.NET
P3P: CP="NOI ADM DEV PSAi COM NAV OUR OTR STP IND DEM"
P3P: policyref="/w3c/p3p.xml", CP="IDC DSP COR IVAi IVDi OUR TST"
Content-Length: 0
IE would appear to be following the redirect and does a GET, but as you can see is not sending back the cookies:
GET https://www.newSite.com/test/Edit/ef746317-7711-4458-8873-a73700fc1b85 HTTP/1.1
Accept: image/jpeg, application/x-ms-application, image/gif, application/xaml+xml, image/pjpeg, application/x-ms-xbap, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, */*
Accept-Language: en-US
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.2; WOW64; Trident/7.0; .NET4.0E; .NET4.0C; .NET CLR 3.5.30729; .NET CLR 2.0.50727; .NET CLR 3.0.30729)
Connection: Keep-Alive
Host: www.newSite.com
And then of course is the 401:
HTTP/1.1 401 Unauthorized
Content-Type: text/html
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Wed, 15 Mar 2017 23:48:00 GMT
Content-Length: 1293
<HTML>Blah blah blah access denied error</HTML>
I tried adding the P3P headers to force IE to send the cookies on the redirect, but no dice. I have read there may be an issue with IE sending cookies on a redirect when going from HTTP to HTTPS, or that because the "secure" cookies are sent back to the browser over HTTP, the redirect to HTTPS sees a different domain and chokes. I cannot change the web sites, as they belong to vendors, but I can alter the 302 being sent back to IE11 with the interface middleware I am working on. Any thoughts on how I can trick/force IE into sending back these cookies on the redirect?
Update 1: I have tried Firefox 52, IE11, and Chrome. No browser accepts that 302 and sends the GET back with the cookies. Someone out there must understand how redirects with cookies work; the lack of responses makes me wonder if this question is reaching the right people.
I got around my issue with sort of a hack. Since the browser would not forward cookies back in a 302 redirect, I just send back a little page that does the posting for me, instead of my interface software.
<!DOCTYPE html>
<html>
<head>
<title>Redirect</title>
</head>
<body>
<form action="https://testAPI.test.com/" method="POST">
<input name="UserName" value="Test APIUser"/>
<input name="UserEmail" value="test#test.com"/>
<input name="PatientId" value="1d11eb2e-2606-485e-ad5d-a70c00daa37a"/>
<input name="Timestamp" value="Mon, 20 Mar 2017 19:11:24 GMT"/>
<input name="Token" value="MRVp/pBRBJ08F8cYMavfL8 ="/>
</form>
<script language="javascript"> window.setTimeout('document.forms[0].submit()', 0);</script>
</body>
</html>
You can disable Protected Mode. Protected Mode is designed to prevent malicious software from exploiting vulnerabilities in Internet Explorer 11, and it may also block cookies depending on the current setup.
1. Open Internet Explorer 11.
2. Click Tools and then select Internet options.
3. Go to the Security tab.
4. Under "Security level for this zone", clear the check box for "Enable Protected Mode (requires restarting Internet Explorer)".
5. Click OK.
6. Close Internet Explorer 11 and then launch it again.

Object Debugger 404 error

I have checked similar questions asked, but none seem to match the circumstances of this one.
This page is returning a 404 error in Facebook's Object Debugger tool. Other pages on the site work okay, so it shouldn't be down to missing meta tags.
Now, some of the page content is hidden, but only some; the majority of the page content is available, so surely this shouldn't be causing the issue. If it does, then that would have to be regarded as a bug, no?
Does anyone have any idea what the issue might be and/or how to fix it?
The error message is accurate - your URL is returning an error when the Facebook crawler attempts to fetch the metadata.
You'll need to check your server settings, or the code which renders that URL, to see why it's doing so. Here's the output when I made the same request Facebook makes, from my own laptop:
$ curl -A "facebookexternalhit/1.1" -i 'http://austparents.edu.au/webinars/parent-webinar-on-the-australian-curriculum-with-rob-randall-ceo-acara/'
HTTP/1.1 403 Forbidden
Date: Tue, 23 Sep 2014 00:03:36 GMT
Server: Apache/2.2.14 (Ubuntu)
Vary: Accept-Encoding
Content-Length: 366
Content-Type: text/html; charset=iso-8859-1
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access /webinars/parent-webinar-on-the-australian-curriculum-with-rob-randall-ceo-acara/
on this server.</p>
<hr>
<address>Apache/2.2.14 (Ubuntu) Server at austparents.edu.au Port 80</address>
</body></html>

Google share and Facebook sharer not pulling through information

I am running an MVC application on IIS.
When sharing a URL on either Facebook (https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Ftn.hollaroo.com%2Fcontent%2Faux%2Fhollaroo%2Findex.html) or Google+ (https://plus.google.com/share?url=http://tn.hollaroo.com/content/aux/hollaroo/index.html), it works with the static index.html.
When I try to do the same thing with tn.hollaroo.com/terms, no metadata (title, description, image) is pulled through. index.html is a "view source + save as HTML" copy of /terms, so I doubt that the error is in the HTML itself.
The header section is as follows:
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:og="http://ogp.me/ns#"
xmlns:fb="http://www.facebook.com/2008/fbml"
itemscope itemtype="http://schema.org/Article">
<head runat="server">
<meta charset="utf-8" />
<title></title>
<meta itemprop="name" content="Hollaroo Trusted Network">
<meta property="og:description" content="The Trusted Network ...">
<meta name="description" content="The Trusted Network ..." />
<meta property="og:title" content="Hollaroo Trusted Network" />
<meta property="og:type" content="website" />
<meta property="og:image" content="http://tn.hollaroo.com/content/aux/hollaroo/images/posting.jpg" />
<meta property="og:site_name" content="Hollaroo - Private Social Recruitment Networks" />
is all there.
I have run curl, and the main difference I spot is that /terms is setting a cookie:
~# curl -I http://tn.hollaroo.com/terms
HTTP/1.1 200 OK
Cache-Control: private
Content-Length: 26085
Content-Type: text/html; charset=utf-8
Set-Cookie: ASP.NET_SessionId=wscxgkryniqa0qd3dmukjpxe; path=/; HttpOnly
Date: Fri, 07 Mar 2014 16:03:09 GMT
~# curl -I http://tn.hollaroo.com/content/aux/hollaroo/index.html
HTTP/1.1 200 OK
Cache-Control: public
Content-Length: 26400
Content-Type: text/html
Last-Modified: Fri, 07 Mar 2014 15:44:21 GMT
Accept-Ranges: bytes
ETag: "e076f21b1c3acf1:0"
Date: Fri, 07 Mar 2014 16:03:32 GMT
The /terms URL does not require login.
According to my IIS log, and to my own log in the app, I do get hits from Facebook and I do return data:
IIS LOG:
2014-03-07 15:05:44.590 /terms - "D:\WEBS\Edge\terms" 200 "DEMO1" - 0 0 225
2014-03-07 15:05:54.605 /terms fb_locale=en_GB "D:\WEBS\Edge\terms" 200 "DEMO1" - 0 0 267
Url UserId IPAddress Browser At
/terms NULL 173.252.100.117 facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php) 2014-03-07 15:05:44.263
/terms?fb_locale=en_GB NULL 173.252.100.113 facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php) 2014-03-07 15:05:54.387
I am not disallowing web crawlers or blocking fb's IP.
Thank you very much for your help!