I was wondering if there are any intents available that can be sent to the Android app. For example, it would be nice if my app could switch to a particular scene by sending an Intent to the Lifx app, without needing to deal with the HTTP or LAN API myself. Anything like that available or planned?
We currently do not have any intents built into the application. I would be really curious to hear what intents you would like to see. I’m not promising anything, and nothing is yet planned.
I think it would be a nice alternative to using the HTTP API. Currently the most useful thing for me would be activating a scene, quite similar to what the widgets do right now.
Excellent, I’ll forward this to our Android developers for consideration. I’ll let you know if something comes of it.
The effects (pulse and breathe) would be nice to have intents for, too. I can imagine this would be really useful for any app that does notifications. (An alarm clock and a messaging app would be two examples.)
Hey there, are there any updates on this subject?
I was tinkering with NFC tools and tasks recently and would love to be able to trigger an Intent by putting my phone near an NFC tag.
on/off/scene (by name or ID)/custom would be super nice to have for NFC usage
This has not yet made it to the product roadmap, I’ll update this thread if it does.
Aw. Thanks for the update!
I tried using Tasker with a few plugins, but one breaks bulb connectivity to other apps and another only allows ON/OFF for all bulbs. Oh well, it might be a lot quicker to write my own code.
Any update on this? I see there are some intents built into the app now.
From the manifest:
When I call TURN_ON, for example, all lights turn on and the app opens. I just want the lights to turn on, nothing more. Is there any way to do this? And how can I specify which individual bulbs or groups should turn on? Some documentation on the other intents would be nice as well.
Those Intents were built for the Google Now integration; given the limited functionality available there, light targeting wasn’t a requirement. They act on the last light whose control screen you had open, or on all lights if there is no such target.
Is there a reason that you don’t want to use the HTTP API for these use cases?
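For context, the HTTP API calls being discussed here are fairly small. This is a sketch of building one with the plain Java 11+ HTTP client, using the documented `PUT /v1/lights/{selector}/state` endpoint; the token is a placeholder, and the request is built but not sent:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class LifxHttpSketch {
    // Placeholder: a personal access token generated in LIFX cloud settings.
    static final String TOKEN = "YOUR_APP_TOKEN";

    // Build PUT /v1/lights/{selector}/state, e.g. selector "all" or "group:Bedroom".
    static HttpRequest setPower(String selector, String power) {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://api.lifx.com/v1/lights/" + selector + "/state"))
                .header("Authorization", "Bearer " + TOKEN)
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString("{\"power\":\"" + power + "\"}"))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = setPower("all", "on");
        System.out.println(req.method() + " " + req.uri());
        // To actually send it (needs a real token and network access):
        // java.net.http.HttpClient.newHttpClient()
        //     .send(req, java.net.http.HttpResponse.BodyHandlers.ofString());
    }
}
```

Scene activation, which the thread keeps coming back to, works the same way per the HTTP API docs: `PUT /v1/scenes/scene_id:{uuid}/activate` with the same Authorization header.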
From a development perspective, integrating with intents is much easier. All the authentication and whatnot has already been done in the Lifx app, and it would probably be a smoother experience too.
And the 1-3sec lag would be gone too
Yes intents would be very very handy to have
Is there any update on this? I agree that this is much better for integration than the HTTP API, which is pretty laggy, especially from Australia.
I just checked the latest APK, and found these:
Is there any documentation available on them?
It would be good to be able to start the app too (I’ve got an old android phone on my wall as a controller), although I can probs just do that with Tasker.
@markh it would be great to get an answer here. I’m building an Android app for myself using the great snips.ai voice-rec setup, and using intents would be much smoother than having to code everything up myself; see the good points from others above about why. And it sounds like all the intents are already there, so it would just be a matter of providing some documentation… The two main reasons I really want it: the API is way too slow from Australia, and it would be much easier and snappier using intents.
Please just document this. Your Android SDK is a mess and unmaintained; I’ve spent hours trying to extract the intent info myself and hit many walls, and I really don’t want to use the API if I can possibly avoid it.
This is what I’m calling:

Intent sendIntent = new Intent();
sendIntent.setAction("com.lifx.lifx.actions.TURN_ON");
startActivity(sendIntent);

It pulls your app to the foreground but nothing else. What flags etc. do I need to send to make at least on and off work? Please help me, I’m getting very frustrated…
@Shanness the Intents that are currently in the app were not built for 3rd party developer use. They were originally built for the Google Now implementation.
We can look at adding Intent handling into a future app release.
What Intents would you like us to handle?
Well the stuff that google now uses is probably what I’d need. i.e. it seems to do most of what I need… Would be great to query and activate scenes and other cool stuff like music vis, pastels etc too.
So it seems I’m left with either reverse engineering the existing intents, using the slow API (it’s slow when combined with voice, hence why I assume you added intents for Google Now?), or trying to figure out how to get the LAN protocol in your unmaintained (and pretty horrible, I looked at the source last night) Android SDK working?
None of these are much fun, and if you won’t help, I suppose I’ll keep trying to reverse engineer the existing intents first. I’m working on a broken but promising APK decompiler at the moment so I can get the docs you won’t give me…
Not impressed… Or I suppose I could just buy some lights that support this better?
Google Now worked in a completely different way to the newer Google Assistant integration. It was deprecated by Google a couple of years ago. Those Intents could be updated for more general purpose usage.
The Android LAN SDK is long since deprecated, I wouldn’t recommend using that. There is a newer library that you could look at using:
Hmm, OK… That Kotlin library doesn’t seem to support ALL ON/OFF (neither did yours, for that matter). The most basic feature I need doesn’t seem to be supported (it would require hacking their code?).
So I suppose the big question here is how the “Google Assistant integration” you mentioned works. It seems too fast to me to be using the API? And I don’t want to take a step backwards just to get voice going (GA doesn’t work well for my system: I’ve got a phone on the wall, and GA doesn’t seem to support a user with more than one phone; that, and the privacy issues).
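For what it’s worth, the “ALL ON/OFF” case is small enough to do against the raw LAN protocol without any library. This is a sketch based on the publicly documented LAN protocol (header layout and the SetPower message number are taken from those docs; I haven’t verified it against real bulbs): a tagged packet with an all-zero target addresses every bulb, and broadcasting it over UDP to port 56700 switches them all:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LifxLanSketch {
    // Build a LIFX LAN "SetPower" (message type 21) packet addressed to
    // every bulb on the network (tagged=1, target=0), per the published
    // LAN protocol docs: 36-byte header + 2-byte power-level payload.
    static byte[] setPowerAll(boolean on) {
        ByteBuffer buf = ByteBuffer.allocate(38).order(ByteOrder.LITTLE_ENDIAN);
        buf.putShort((short) 38);        // frame: total packet size
        buf.putShort((short) 0x3400);    // tagged | addressable | protocol 1024
        buf.putInt(2);                   // source: any client-chosen id
        buf.put(new byte[8]);            // frame address: target 0 = all devices
        buf.put(new byte[6]);            // reserved
        buf.put((byte) 0);               // no ack/response requested
        buf.put((byte) 0);               // sequence number
        buf.putLong(0L);                 // protocol header: reserved
        buf.putShort((short) 21);        // message type 21 = SetPower
        buf.putShort((short) 0);         // reserved
        buf.putShort((short) (on ? 0xFFFF : 0)); // power level: max = on
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] pkt = setPowerAll(true);
        System.out.println(pkt.length); // 38
        // To send, broadcast over UDP to the LAN protocol port 56700, e.g.:
        // try (java.net.DatagramSocket s = new java.net.DatagramSocket()) {
        //     s.setBroadcast(true);
        //     s.send(new java.net.DatagramPacket(pkt, pkt.length,
        //             java.net.InetAddress.getByName("255.255.255.255"), 56700));
        // }
    }
}
```

No auth, no cloud round-trip, so none of the lag being complained about above; the trade-off is you only reach bulbs on the local network.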