Use these applications to add AirPlay capabilities to Chromecast and UPnP (like Sonos) players, to make them appear as AirPlay devices.
AirConnect can run on any machine that has access to your local network (Windows, macOS, Linux x86/x64/ARM, Solaris and FreeBSD). It does not need to be on your main computer (for example, a Raspberry Pi works well). It will detect UPnP/Sonos/Chromecast players, create as many virtual AirPlay devices as needed, and act as a bridge/proxy between AirPlay clients (iPhone, iPad, iTunes, macOS, AirFoil ...) and the real UPnP/Sonos/Chromecast players.
The audio, after being decoded from ALAC, can be sent in plain (uncompressed) form, or re-encoded using MP3 or FLAC. Most players will not display metadata (artist, title, album ...) except when MP3 re-encoding is used and the UPnP/DLNA device supports the ICY protocol. Chromecast players do not support this (yet).
Download the [executable] for your platform, e.g. aircast-osx-multi for Chromecast on OS X, or airupnp-osx-multi for UPnP/Sonos on OS X.
For Windows, download all the .dll files as well.
Store the [executable] (e.g. airupnp-osx-multi) in any directory. On non-Windows machines, open a terminal, change to the directory where the executable is stored and run chmod +x [executable] (example: chmod +x airupnp-osx-multi). Note that if you choose to download the whole repository (instead of individual files) from your web browser and then unzip it, the file permissions in the bin/ sub-directory should already be set.
In Docker, you must use ‘host’ network mode to enable the audio webserver.
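For illustration only (the image name is a placeholder, not an official image), a Docker invocation in host network mode might look like this:

```
# Host networking is required so AirPlay clients and players can reach
# the bridge's audio webserver; replace [airconnect-image] with the image you use.
docker run -d --network host [airconnect-image]
```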
Double click the [executable] or launch it by typing ./[executable] in the same command line window. For Sonos & Heos players, set latency by adding -l 1000:2000 on the command line (example: ./airupnp-osx-multi -l 1000:2000).
You should start to see lots of log messages on screen. Using your iOS/Mac/iTunes/Airfoil/other client, you should now see new AirPlay devices and can try to play audio to them.
If it works, type exit, which terminates the executable, and then, on non-Windows/MacOS machines, relaunch it with -z so that it runs in the background and you can close the command line window. You can also start it automatically using any startup script or a Linux service as explained below. Nothing else should be required; there is no library or anything else to install.
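For example, a background launch on Linux might look like the following (the binary name is illustrative; use whichever build you downloaded):

```
# Run as a daemon with Sonos/Heos latency settings; the terminal can then be closed
./airupnp-linux-x86_64 -l 1000:2000 -z
```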
- Use -h for command line details.
- save [name]: typed at the running executable's prompt (like exit above), saves the current configuration in a file named [name].
- A configuration file (default config.xml) can be created for advanced tweaking (a reference version can be generated using the -i [config file name] command line option; see the example after this list).
- Use -b [ip] to set which network interface (card) to bind to. Note that 0.0.0.0 is not authorized.
- When the executable is started by a script (without an interactive console), launch it with -Z, otherwise it will consume all CPU. On Linux, FreeBSD and Solaris, it is best to use -z instead. Note that the -z option is not available on MacOS or Windows.
- Playback is kept in sync by occasionally adding or dropping a frame when the source runs too fast or too slow; use -r to disable such adjustments (or use the <drift> option in the config file), but that might cause overrun or underrun on long playbacks.
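As a sketch of how these options combine (the binary name and file names are just examples; -x, used later in the systemd example, loads a specific config file):

```
# Generate a reference configuration file for the discovered players
./airupnp-linux-x86_64 -i config.xml

# Run using that configuration, bound to a specific network interface, as a daemon
./airupnp-linux-x86_64 -x config.xml -b 192.168.1.10 -z
```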
The default configuration file is config.xml, stored in the same directory as the [executable]. Each of the parameters below can be set in the <common> section to apply to all devices. It can also be set in any <device> section to apply only to a specific device and override the value set in <common>.
- latency <[rtp][:http][:f]>: (default 0:0) buffering tweaking, needed when audio is stuttering or on bad networks (delays playback start)
- drift <0|1>: enable adding or dropping a frame when the source produces frames too fast or too slow
- enabled <0|1>: in the <common> section, enables newly discovered players by default; in a dedicated <device> section, enables that player
- name: the name that will appear for the device in AirPlay. You can change the default name.
- upnp_max: sets the maximum UPnP version used to search for players (default 1)
- codec <mp3[:<bitrate>] | flc[:0..9] | wav | pcm>: format used to send HTTP audio. FLAC is recommended but uses more CPU (pcm is only available for UPnP). For example, mp3:320 for 320 kbps MP3 encoding.
- metadata <0|1>: send metadata to the player (only for the mp3 codec and if the player supports the ICY protocol)
- media_volume <0..1>: (default 0.5) applies a scaling factor to the device’s hardware volume (Chromecast only)
- artwork: a URL to artwork to be displayed on the player
These are the global parameters:
- log_limit <-1 | n>: (default -1) when using a log file, limits its size to ‘n’ MB (-1 = no limit)
- max_players: sets the maximum number of players (default 32)
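As an illustration, the parameters above might be combined in config.xml like this (the values and the device name are made up; generate a reference file with -i to get the exact structure and per-device identifiers):

```xml
<common>
  <enabled>1</enabled>          <!-- enable newly discovered players by default -->
  <codec>flc</codec>            <!-- FLAC re-encoding: recommended, but more CPU -->
  <latency>1000:2000</latency>  <!-- [rtp]:[http] buffering -->
</common>
<device>
  <name>Living Room</name>      <!-- name shown to AirPlay clients -->
  <codec>mp3:320</codec>        <!-- per-device override of the common value -->
  <metadata>1</metadata>        <!-- only effective with mp3 codec and ICY-capable players -->
</device>
```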
Create a systemd service file named airupnp.service (typically under /etc/systemd/system) with the following content (assuming the airupnp binary is in /var/lib/airconnect):
```
[Unit]
Description=AirUPnP bridge
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/var/lib/airconnect/airupnp-arm -l 1000:2000 -Z -x /var/lib/airconnect/airupnp.xml
Restart=on-failure
RestartSec=30

[Install]
WantedBy=multi-user.target
```
Enable the service:

```
sudo systemctl enable airupnp.service
```

Start the service:

```
sudo service airupnp start
```

To start or stop the service manually, type sudo service airupnp start|stop in a command line window.

To disable the service, type sudo systemctl disable airupnp.service.

To view the log: journalctl -u airupnp.service
Obviously, from the above example, only use -x if you want a custom configuration. Thanks to @cactus for the systemd cleanup.
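If you later edit the unit file, standard systemd practice (not specific to AirConnect) is to reload the unit definitions before restarting:

```
sudo systemctl daemon-reload
sudo service airupnp restart
```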
Create the file com.aircast.bridge.plist in ~/Library/LaunchAgents/
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.aircast.bridge</string>
    <key>ProgramArguments</key>
    <array>
        <string>/[path]/aircast-osx-multi</string>
        <string>-Z</string>
        <string>-x</string>
        <string>/[path]/aircast.xml</string>
        <string>-f</string>
        <string>/[path]/aircast.log</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
    <key>LaunchOnlyOnce</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```
[path] is the path where you’ve stored the aircast executable. It can be, for example, a directory under /Users/xxx, where xxx is your user name.
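Assuming standard launchd behaviour (not specific to this project), the agent can then be loaded without rebooting:

```
launchctl load ~/Library/LaunchAgents/com.aircast.bridge.plist
```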
There are many tools that allow an application to be run as a service. You can try this one
The UPnP version (airupnp) is often used with Sonos players. When a Sonos group is created, only the master of that group will appear as an AirPlay player, and the others will be removed if they were already detected. If the group is later split, the individual players will re-appear.
When changing the volume of a group, each player’s volume is changed trying to respect the relative values. It’s not perfect and still under test. To reset all volumes to the same value, simply move the cursor to 0 and then to the new value; all players will then have the same volume. You need to use the Sonos application to change individual volumes.
To identify your Sonos players, pick a detected IP address and open that player’s Sonos status page in your browser. Look under Zone Players and you will see the identifiers for your players.
@chpusch has found that Bose SoundTouch speakers work well, including synchronisation (as for Sonos, you need to use Bose’s native application for grouping / ungrouping). I don’t have a SoundTouch system so I cannot do the level of slave/master detection I did for Sonos.
Some of these speakers only support mp3 and require a modified ProtocolInfo to stream correctly. This can be done by editing the config file, setting <codec>mp3</codec> and replacing the <mp3>..</mp3> line with:
Note: you can use -i config.xml to generate a config file if you do not have one.
When players disappear regularly, it might be that your router is filtering out multicast packets. For example, on an Asus AC-RT68U, you have to log in by ssh and run echo 0 > /sys/class/net/br0/bridge/multicast_snooping, but this does not persist across reboots.
Lots of users seem to have problems with Ubiquiti UniFi equipment and broadcasting / finding players. Here is a guide https://www.neilgrogan.com/ubnt-sonos/ made by somebody who fixed the issue for his Sonos.
I’ve received this question many times: why is there (sometimes) a delay of many seconds between switching tracks (or sources) on my iPhone and hearing the change?
To understand, it’s better that you read the next paragraph, but as you probably won’t, here is a quick summary of how AirPlay works. As far as the sender (e.g. your iPhone) is concerned, once the connection with an AirPlay ‘speaker’ is established, this connection is almost like an analogue wire with a delay (buffer) of 1 or 2 seconds.
What iOS does nowadays is that when you switch between tracks, instead of closing the connection and re-creating one, it just pushes the new audio through the existing connection, so you might have 1~2 seconds of previous audio in the pipe before the new audio plays. The same happens when stopping/pausing playback: iOS simply stops pushing audio through the wire.
There is a function to “flush” the audio in the pipe so that new audio plays immediately, but I’ve seen that recent versions of iOS don’t use it anymore (or some applications decide to not flush while they could). That’s not a big deal with most AirPlay speakers, it’s a 1~2 second delay.
But with AirConnect, the AirPlay speaker is not a speaker, it’s a UPnP or Chromecast player. These do not at all act like virtual wires; instead they expect to have the whole track available as a file and retrieve data from it as needed. In fact, one of the key functions of AirConnect is to look like a wire to the iPhone and like a file to the UPnP/CC player.
Usually, UPnP/CC players consume a large chunk of that ‘file’ before they start to play, to handle network congestion, but some don’t and simply start playing at the first received byte, counting on the large chunk coming quickly. But that chunk/buffer does not exist for AirConnect, as the audio is produced in real time by the iPhone. So if a player starts at the first byte, it will very likely lack audio data when network congestion occurs and playback will stutter. The http latency parameter solves this issue by creating a silence buffer sent in a burst when establishing the connection, but this creates a permanent delay between the iPhone and the player. Some UPnP/CC players wait until they have buffered enough data before they start playing and again, because that data is built in real time by AirConnect, this other delay adds up to the latency parameter (even if http latency is 0).
When you switch between tracks or sources (or pause/stop), if your iPhone sends this “flush” command, then AirConnect immediately stops the UPnP/CC player. But if there is no flush command, it will play until these silence + self buffers are consumed ... that can be more than a few seconds.
In addition, the delay can increase over time depending on the clock speed difference between the iPhone and the UPnP/CC player. Say the iPhone’s clock is 1% faster than the player’s clock: when it has produced 300 s (5 min) of audio, the player has received it all but has only played 297 s, so there is an additional delay of 3 s. If the iPhone moves to the next track without the flush command, then the UPnP/CC player will start playing the new audio (or stop) http latency + self-buffer length + 3 seconds later ... that can be a lot!
Unfortunately, there is nothing I can do about that. By not using the “flush” command, iOS or the applications using AirPlay create an issue that AirConnect has no way to identify or avoid.
These bridges receive realtime “synchronous” audio from the AirPlay controller in the format of RTP frames and forward it to the Chromecast/UPnP/Sonos player in an HTTP “asynchronous” continuous audio binary format (notion of frames does not exist on that side). In other words, the AirPlay clients “push” the audio using RTP and the Chromecast/UPnP/Sonos players “pull” the audio using an HTTP GET request.
A player using HTTP to get its audio expects to receive an initial large portion of audio as the response to its GET, and this creates a buffer large enough to handle most further network congestion/delays. The rest of the audio transmission is regulated by the player using TCP flow control. But when the source is an AirPlay RTP device, there is no such large portion of audio available in advance to be sent to the player, as the audio comes to the bridge in real time. Every 8 ms, an RTP frame is received and is immediately forwarded as the continuation of the HTTP body. If the CC/UPnP/Sonos player starts to play the first received audio sample immediately, expecting an initial burst to follow, then any network congestion delaying RTP audio will starve the player and create stuttering.
The [http] parameter allows a certain number of silence frames to be sent to the Chromecast/UPnP/Sonos player in a burst at the beginning. Then, while this “artificial” silence is being played, the bridge can build a buffer of RTP frames that will hide network delays that might happen in further RTP frame transmission. This delays the start of playback by [http] ms.
But RTP frames are transmitted using UDP, which means there is no guarantee of delivery, so frames might be lost from time to time (happens often on WiFi networks). To allow detection of lost frames, they are numbered sequentially (1,2 ... n) so every time two received frames are not consecutive, the missing ones can be requested again by the AirPlay receiver.
Normally, the bridge immediately forwards every RTP frame using HTTP and again, in HTTP the notion of frame numbers does not exist; it’s just continuous binary audio. So it’s not possible to send audio non-sequentially when using HTTP.
For example, if the received RTP frames are numbered 1,2,3,6, this bridge will forward 1,2,3 immediately using HTTP (once decoded and transformed into raw audio), but when it receives 6, it will request 4 and 5 to be resent and hold 6 while waiting (if 6 were transmitted immediately, the Chromecast/UPnP/Sonos would play 1,2,3,6 ... not nice). The [rtp] parameter sets how long frame 6 shall be held before adding two silence frames for 4 and 5 and sending 4,5,6. Obviously, if this delay is larger than the buffer in the Chromecast/UPnP/Sonos player, playback will stop for lack of audio. Note that [rtp] does not delay playback start.
When [f] is set, silence frames will be inserted as soon as no RTP frames have been received for [rtp] ms. This ensures that a continuous stream of audio is available on the HTTP server. This might be necessary for some players that close the HTTP connection if they have not received data for some time; it’s unlikely though. Note that otherwise, when the RTP stream is interrupted for more than [http] ms, the UPnP/CC player will stop anyway as its buffers will be empty. Still, as soon as the RTP stream resumes, the bridge will receive frame N, where the last received one might be N-500. So it will request the (up to) [rtp] missing ones (might be fewer than 500), and restart playing at N-[http], so potentially silence will be inserted.
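As a concrete (made-up) example combining the three fields in the config file:

```xml
<!-- hold out-of-order gaps for up to 500 ms before filling them with silence,
     send a 2000 ms silence burst when the connection is established, and keep
     inserting silence whenever no RTP frame has arrived for 500 ms -->
<latency>500:2000:f</latency>
```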
Many have asked for a way to do video/audio synchronisation so that UPnP (Sonos) players can be used as speakers when playing video on a computer or tablet (YouTube for example). Due to this RTP-to-HTTP bridging, this cannot be done, as the exact time when an audio frame is played cannot be controlled on the HTTP client. AirPlay speakers can achieve that because the iPhone/iPad/Mac player will “delay” the video by a known amount, send the audio in advance (usually 2 seconds) and then control the exact time at which this audio is output by the speaker. But although AirConnect has the exact request timing and maintains synchronisation with the player, it cannot “relay” that synchronisation to the speakers: the UPnP protocol does not allow it and Sonos has not made their protocol public. Sometimes you might get lucky because the video-to-audio delay almost matches the HTTP player delay, but it is not reproducible and will not be stable over time.
If you want to recompile, you’ll need: