Using Apache Knox in a non-Hadoop app

Does it make sense to use Knox (+LDAP) as an authentication proxy for an application that does not use Hadoop at all?
I'm new to this domain and have heard that this is possible, but I don't quite get it. Are there viable alternatives?

Absolutely! You can use Apache Knox without Hadoop. I use it to secure my Raspberry Pi Motion setup; this blog post (still a work in progress, but it will give you an idea) walks through it: Securing Raspberry Pi Security Camera (or UIs) using Apache Knox.
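For context, fronting a non-Hadoop app comes down to writing a Knox topology that pairs an LDAP-backed authentication provider with a service entry pointing at your app. A minimal sketch (class names are for Knox 1.x; the MOTION role, hostnames, and LDAP settings are placeholders, and a custom role like this needs a matching service definition under Knox's data/services directory, which the blog post covers):

    <topology>
      <gateway>
        <provider>
          <!-- Authenticate every request against LDAP via Apache Shiro -->
          <role>authentication</role>
          <name>ShiroProvider</name>
          <enabled>true</enabled>
          <param>
            <name>main.ldapRealm</name>
            <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealm</value>
          </param>
          <param>
            <name>main.ldapRealm.userDnTemplate</name>
            <value>uid={0},ou=people,dc=example,dc=com</value>
          </param>
          <param>
            <name>main.ldapRealm.contextFactory.url</name>
            <value>ldap://localhost:33389</value>
          </param>
          <param>
            <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
            <value>simple</value>
          </param>
          <param>
            <name>urls./**</name>
            <value>authcBasic</value>
          </param>
        </provider>
        <provider>
          <role>identity-assertion</role>
          <name>Default</name>
          <enabled>true</enabled>
        </provider>
      </gateway>
      <!-- Proxy target: any HTTP app, Hadoop or not -->
      <service>
        <role>MOTION</role>
        <url>http://raspberrypi:8081</url>
      </service>
    </topology>

Once the topology is deployed, the app is reachable (by default) at https://<knox-host>:8443/gateway/<topology>/..., and every request is authenticated against LDAP first.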

Related

Can I use ESP8266 with edgeSDK?

I've started testing edgeSDK in a prototype IoT environment.
The idea is to connect devices with sensors and other nodes (Raspberry Pi, ESP8266, macOS, etc.) and exchange data or messages between them on the edge, trying to avoid communicating through the cloud.
(I will also be "mirroring" these exchanges in a central AWS cloud environment, to make some comparisons/evaluations.)
At this point, I have edgeSDK running on macOS and the Raspberry Pi and would like to add ESP8266 into the mix.
My question is:
Can I get the ESP8266 to work with edgeSDK? I don't see it listed as a supported platform.
If so, which OS? (I was thinking of Mongoose OS, to keep coding in JavaScript and follow the standard.)
Any other comments/suggestions or similar references would be very welcome!
The ESP8266 is a microcontroller, which edgeSDK does not support. However, you can run a RESTful API client on the ESP8266 to call an API served by an edgeSDK-hosted microservice on, for example, a Raspberry Pi.
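To sketch that pattern, here is roughly what the ESP8266 side could look like under MicroPython (the Wi-Fi credentials and the microservice's host, port, and path are placeholders, not anything edgeSDK defines):

    # MicroPython on the ESP8266: plain REST client calling a microservice
    # hosted by edgeSDK on the Raspberry Pi.
    import network
    import urequests  # named 'requests' on newer MicroPython builds

    def connect_wifi(ssid, password):
        sta = network.WLAN(network.STA_IF)
        sta.active(True)
        if not sta.isconnected():
            sta.connect(ssid, password)
            while not sta.isconnected():
                pass
        return sta

    connect_wifi('my-ssid', 'my-password')

    # POST a sensor reading to the (hypothetical) microservice endpoint.
    resp = urequests.post(
        'http://192.168.1.50:8083/sensors/report',
        json={'device': 'esp8266-01', 'temperature': 21.5},
    )
    print(resp.status_code, resp.text)
    resp.close()

Mongoose OS would work just as well on the device side; the point is that the ESP8266 only speaks plain HTTP to the edge node rather than running edgeSDK itself.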

Connecting Leshan to a server and the cloud

I have the task of implementing IoT device management using Eclipse Leshan. I have difficulty understanding how Eclipse Leshan connects IoT sensors with servers and the cloud. Am I right in saying that Eclipse Leshan does not require a gateway like Eclipse Kura to connect to a server and the cloud?
Does anyone know where the complete documentation for Eclipse Leshan is? It would be very helpful if there were example programs implementing Eclipse Leshan.
Thank you
Eclipse Leshan is a library for implementing applications that use the LWM2M protocol to manage devices. As such, your application can use Leshan's Java API in order to interact with devices that also support LWM2M.
LWM2M does not per se mandate a transport protocol. However, the spec is written assuming that CoAP over UDP is used for that purpose. In fact, the LW in LWM2M stands for Lightweight and as such, using CoAP as the transport protocol makes a lot of sense for managing constrained devices.
Eclipse Leshan itself does not connect to a server or cloud; rather, it is usually part of an application that is hosted on a server (in the cloud). However, you need to implement that application yourself because Leshan, as indicated above, is just a library. The devices then interact with your LWM2M-enabled application. Because CoAP/UDP uses standard IP, this interaction can occur over public internet infrastructure if your use case calls for it, i.e. no gateway is necessarily needed. You can, however, also connect your devices to a local gateway, e.g. Kura, and then connect the gateway to your LWM2M server in the cloud instead. It really depends on your use case and the capabilities of the devices.
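Leshan's API is Java, but to make the transport concrete: an LWM2M read is just a CoAP request over UDP to a well-known resource path. Purely as an illustration (not Leshan itself), here is that exchange sketched with the Python aiocoap library; the device address is a placeholder:

    # Illustration of LWM2M's transport: a CoAP GET over UDP to a device
    # resource path. Requires 'pip install aiocoap'.
    import asyncio
    from aiocoap import Context, Message, GET

    async def main():
        ctx = await Context.create_client_context()
        # /3/0/0 = LWM2M Device object (3), instance 0, Manufacturer resource (0)
        request = Message(code=GET, uri='coap://192.168.1.60/3/0/0')
        response = await ctx.request(request).response
        print(response.code, response.payload)

    asyncio.run(main())

Leshan handles exactly this kind of exchange for you, plus registration, observe, and security.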

Solaris MQ accessing Message Hub in Bluemix via Secure Gateway

I'm testing bridging messages from WMQ on Solaris to Message Hub on Bluemix, and vice versa, using Secure Gateway.
Since the Secure Gateway client is not available for Solaris, what are my options for running it on Solaris?
@Vignesh, you are correct: the Secure Gateway team does not support Secure Gateway on Solaris. Unfortunately, you are on your own here.
If you are desperate to get this working, you can unpack the .deb or .rpm packages and try to replicate the installation yourself, translated for Solaris.
Be warned that even this may not be feasible, as Secure Gateway's application runtime, Node.js, may or may not be supported on Solaris.
Your best bet is to run on Ubuntu, or RHEL, if possible.

Send data from a server to a client (Raspberry Pi) without the client polling

I have a Raspberry Pi and I am developing an application on the Pi that can be controlled by a web portal.
So I need to know: if I change something on my website, how can that change be pushed to my Pi (the client) without the client polling for it?
One solution is to install Apache on your Raspberry Pi and set up a basic HTTP API in PHP or Python. When a change is posted on the website, the back-end script makes an API call to the Raspberry Pi's API service.
If your web back end is PHP, you could use json_decode(file_get_contents(...)) to call the Raspberry Pi API.
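As a minimal sketch of the Pi side of that approach in Python (Flask here is just one convenient choice; the route and payload are illustrative):

    # Minimal Pi-side HTTP API with Flask; the website's back end calls
    # POST /update whenever something changes on the portal.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route('/update', methods=['POST'])
    def update():
        change = request.get_json()        # payload sent by the web portal
        print('applying change:', change)  # react to the change on the Pi
        return jsonify(status='ok')

    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=5000)  # reachable from the web server

Note that this requires the web server to be able to reach the Pi over the network (same LAN, VPN, or port forwarding).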
I'd suggest using WebSockets.
WebSockets are bidirectional: client and server can communicate whenever they want over the ongoing TCP session, so you will not need to poll.
You can download and compile libwebsockets on your Raspberry Pi as the server, or, as I did in one of my previous projects, install Node.js on the Pi and use the socket.io library to handle everything. Of course, you will need to modify your web page to behave as a WebSocket or socket.io client.
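If you would rather stay in Python on the Pi, a minimal sketch of the client end using the websockets package (the server URL is a placeholder) looks like this:

    # The Pi keeps one long-lived WebSocket open; the portal's server can
    # push messages down it at any time, so no polling is needed.
    # Requires 'pip install websockets'.
    import asyncio
    import websockets

    async def listen():
        async with websockets.connect('ws://example.com:8765/pi') as ws:
            async for message in ws:
                print('command from portal:', message)

    asyncio.run(listen())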
Good luck!

HDFS web interface alternative

Alright, this is annoying! I am new to Hadoop, and I am trying to find a decent alternative to the basic HDFS web interface. I tried the Hadoop Eclipse plugin, but it seems outdated already, and it's a pain to set up correctly. I have Cloudera's distribution installed and heard about Cloudera Desktop, but it's no longer available. Can anybody suggest a decent alternative to the HDFS web interface where I can easily upload and download files to HDFS via a GUI? P.S. I am running everything locally, no cluster involved. I have searched a lot, but nothing seems to point in the right direction.
You can use WebHDFS, whose REST API supports the complete FileSystem interface for HDFS: http://hadoop.apache.org/docs/r1.0.4/webhdfs.html
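As a sketch of what that looks like in practice, here is an upload and a download via the WebHDFS REST API using Python's requests library (host, port, user, and paths are assumptions; 50070 is the classic NameNode HTTP port):

    # WebHDFS two-step write: the NameNode answers CREATE with a redirect
    # to a DataNode, and the file bytes go to that second URL.
    import requests

    BASE = 'http://localhost:50070/webhdfs/v1'
    PARAMS = {'op': 'CREATE', 'user.name': 'hadoop', 'overwrite': 'true'}

    # Step 1: ask the NameNode where to write (don't follow the redirect).
    r1 = requests.put(f'{BASE}/user/hadoop/remote.txt',
                      params=PARAMS, allow_redirects=False)
    datanode_url = r1.headers['Location']

    # Step 2: send the actual bytes to the DataNode.
    with open('local.txt', 'rb') as f:
        r2 = requests.put(datanode_url, data=f)
    print(r2.status_code)  # 201 Created on success

    # Download: OPEN also redirects, but requests can follow a GET redirect.
    r3 = requests.get(f'{BASE}/user/hadoop/remote.txt',
                      params={'op': 'OPEN', 'user.name': 'hadoop'})
    print(r3.content)

HttpFS, mentioned below, exposes the same REST API, so the same calls work against an HttpFS endpoint (default port 14000).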
OR
You can integrate Hadoop with Hoop (HDFS over HTTP), which is used to access HDFS via the HTTP protocol. Hoop provides access to all Hadoop Distributed File System (HDFS) operations (read and write) over HTTP/S.
For more details, please refer to:
http://bigobject.blogspot.in/2013/03/hoop-https-over-hdfs.html
Alternatively, you can use HttpFS instead of Hoop:
http://bigobject.blogspot.in/2013/03/apache-hadoop-httpfs-service-that.html