Move all the examples from the talk directory into the webrtc examples directory.

Significant changes:

- Move the libjingle_examples.gyp file into the webrtc directory.
- Rename talk/examples/android to webrtc/examples/androidapp to avoid name conflicts.
- Update paths in talk/libjingle_tests.gyp to point to the webrtc directory for the Objective-C test.

BUG=
R=pthatcher@webrtc.org, tkchin@webrtc.org

Review URL: https://codereview.webrtc.org/1235563006 .

Cr-Original-Commit-Position: refs/heads/master@{#9681}
Cr-Mirrored-From: https://chromium.googlesource.com/external/webrtc
Cr-Mirrored-Commit: a8736448970fedd82f051c6b2cc89185b755ddf3
diff --git a/examples/OWNERS b/examples/OWNERS
new file mode 100644
index 0000000..f489e6b
--- /dev/null
+++ b/examples/OWNERS
@@ -0,0 +1,2 @@
+glaznev@webrtc.org
+tkchin@webrtc.org
diff --git a/examples/androidapp/AndroidManifest.xml b/examples/androidapp/AndroidManifest.xml
new file mode 100644
index 0000000..631660a
--- /dev/null
+++ b/examples/androidapp/AndroidManifest.xml
@@ -0,0 +1,49 @@
+<?xml version="1.0" encoding="utf-8"?>
+<manifest xmlns:android="http://schemas.android.com/apk/res/android"
+          package="org.appspot.apprtc"
+          android:versionCode="1"
+          android:versionName="1.0">
+
+    <uses-feature android:name="android.hardware.camera" />
+    <uses-feature android:name="android.hardware.camera.autofocus" />
+    <uses-feature android:glEsVersion="0x00020000" android:required="true" />
+    <uses-sdk android:minSdkVersion="14" android:targetSdkVersion="21" />
+
+    <uses-permission android:name="android.permission.CAMERA" />
+    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
+    <uses-permission android:name="android.permission.RECORD_AUDIO" />
+    <uses-permission android:name="android.permission.INTERNET" />
+    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
+
+    <application android:label="@string/app_name"
+                 android:icon="@drawable/ic_launcher"
+                 android:allowBackup="false">
+
+        <activity android:name="ConnectActivity"
+                  android:label="@string/app_name">
+            <intent-filter>
+                <action android:name="android.intent.action.MAIN"/>
+                <category android:name="android.intent.category.LAUNCHER"/>
+            </intent-filter>
+
+            <intent-filter>
+                <action android:name="android.intent.action.VIEW"/>
+                <category android:name="android.intent.category.DEFAULT"/>
+                <category android:name="android.intent.category.BROWSABLE"/>
+                <data android:scheme="https" android:host="apprtc.appspot.com"/>
+                <data android:scheme="http" android:host="apprtc.appspot.com"/>
+            </intent-filter>
+        </activity>
+
+        <activity android:name="SettingsActivity"
+                  android:label="@string/settings_name">
+        </activity>
+
+        <activity android:name="CallActivity"
+                  android:label="@string/app_name"
+                  android:screenOrientation="fullUser"
+                  android:configChanges="orientation|screenSize"
+                  android:theme="@style/CallActivityTheme">
+        </activity>
+    </application>
+</manifest>
diff --git a/examples/androidapp/README b/examples/androidapp/README
new file mode 100644
index 0000000..3531fa1
--- /dev/null
+++ b/examples/androidapp/README
@@ -0,0 +1,39 @@
+This directory contains an example Android client for https://apprtc.appspot.com
+
+Prerequisites:
+- "Getting the code" on http://www.webrtc.org/native-code/android
+- Set up webrtc-related GYP variables:
+  export GYP_DEFINES="build_with_libjingle=1 build_with_chromium=0 libjingle_java=1
+  OS=android $GYP_DEFINES"
+  To cause WEBRTC_LOGGING to emit to Android's logcat, add enable_tracing=1 to
+  the $GYP_DEFINES above.
+- When targeting both desktop & android, make sure to use a different output_dir
+  value in $GYP_GENERATOR_FLAGS - for example
+  export GYP_GENERATOR_FLAGS="$GYP_GENERATOR_FLAGS output_dir=out_android"
+  or you'll likely end up with mismatched ARM & x86 output artifacts.
+  If you use an output_dir other than out/ make sure to modify the command-lines
+  below appropriately.
+- Finally, run "gclient runhooks" to generate Android-targeting .ninja files.
+
+Example of building & using the app:
+
+cd <path/to/webrtc>/src
+ninja -C out/Debug AppRTCDemo
+adb install -r out/Debug/apks/AppRTCDemo.apk
+
+In desktop Chrome, navigate to https://apprtc.appspot.com and note the r=<NNN> room
+this redirects to, or navigate directly to https://apprtc.appspot.com/r/<NNN> with
+your own room number. Launch AppRTC on the device and add the same <NNN> to the room name list.
+
+You can also run the application from the command line to connect to the first room in the list:
+adb shell am start -n org.appspot.apprtc/.ConnectActivity -a android.intent.action.VIEW
+This should result in the app launching on Android and connecting to the 3-dot-apprtc
+page displayed in the desktop browser.
+To run a loopback test, execute the following command:
+adb shell am start -n org.appspot.apprtc/.ConnectActivity -a android.intent.action.VIEW --ez "org.appspot.apprtc.LOOPBACK" true
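+
+You can also open a specific room directly from the command line. The command
+below is an untested sketch based on the VIEW intent-filter declared in
+AndroidManifest.xml; replace <NNN> with the room number you want to join:
+adb shell am start -n org.appspot.apprtc/.ConnectActivity -a android.intent.action.VIEW -d "https://apprtc.appspot.com/r/<NNN>"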
+
diff --git a/examples/androidapp/ant.properties b/examples/androidapp/ant.properties
new file mode 100644
index 0000000..b0971e8
--- /dev/null
+++ b/examples/androidapp/ant.properties
@@ -0,0 +1,21 @@
+# This file is used to override default values used by the Ant build system.
+#
+# This file must be checked into Version Control Systems, as it is
+# integral to the build system of your project.
+
+# This file is only used by the Ant script.
+
+# You can use this to override default values such as
+#  'source.dir' for the location of your java source folder and
+#  'out.dir' for the location of your output folder.
+
+# You can also use it to define how the release builds are signed by declaring
+# the following properties:
+#  'key.store' for the location of your keystore and
+#  'key.alias' for the name of the key to use.
+# The password will be asked during the build when you use the 'release' target.
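+#
+# Example only (the keystore path and alias below are placeholders):
+#  key.store=/path/to/release.keystore
+#  key.alias=apprtc_release_key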
+
diff --git a/examples/androidapp/build.xml b/examples/androidapp/build.xml
new file mode 100644
index 0000000..ae06794
--- /dev/null
+++ b/examples/androidapp/build.xml
@@ -0,0 +1,92 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project name="AppRTCDemo" default="help">
+
+    <!-- The local.properties file is created and updated by the 'android' tool.
+         It contains the path to the SDK. It should *NOT* be checked into
+         Version Control Systems. -->
+    <property file="local.properties" />
+
+    <!-- The ant.properties file can be created by you. It is only edited by the
+         'android' tool to add properties to it.
+         This is the place to change some Ant specific build properties.
+         Here are some properties you may want to change/update:
+
+         source.dir
+             The name of the source directory. Default is 'src'.
+         out.dir
+             The name of the output directory. Default is 'bin'.
+
+         For other overridable properties, look at the beginning of the rules
+         files in the SDK, at tools/ant/build.xml
+
+         Properties related to the SDK location or the project target should
+         be updated using the 'android' tool with the 'update' action.
+
+         This file is an integral part of the build system for your
+         application and should be checked into Version Control Systems.
+
+         -->
+    <property file="ant.properties" />
+
+    <!-- if sdk.dir was not set from one of the property files, then
+         get it from the ANDROID_SDK_ROOT env var.
+         This must be done before we load project.properties since
+         the proguard config can use sdk.dir -->
+    <property environment="env" />
+    <condition property="sdk.dir" value="${env.ANDROID_SDK_ROOT}">
+        <isset property="env.ANDROID_SDK_ROOT" />
+    </condition>
+
+    <!-- The project.properties file is created and updated by the 'android'
+         tool, as well as ADT.
+
+         This contains project specific properties such as project target, and library
+         dependencies. Lower level build properties are stored in ant.properties
+         (or in .classpath for Eclipse projects).
+
+         This file is an integral part of the build system for your
+         application and should be checked into Version Control Systems. -->
+    <loadproperties srcFile="project.properties" />
+
+    <!-- quick check on sdk.dir -->
+    <fail
+            message="sdk.dir is missing. Make sure to generate local.properties using 'android update project' or to inject it through the ANDROID_HOME environment variable."
+            unless="sdk.dir"
+    />
+
+    <!--
+        Import per project custom build rules if present at the root of the project.
+        This is the place to put custom intermediary targets such as:
+            -pre-build
+            -pre-compile
+            -post-compile (This is typically used for code obfuscation.
+                           Compiled code location: ${out.classes.absolute.dir}
+                           If this is not done in place, override ${out.dex.input.absolute.dir})
+            -post-package
+            -post-build
+            -pre-clean
+    -->
+    <import file="custom_rules.xml" optional="true" />
+
+    <!-- Import the actual build file.
+
+         To customize existing targets, there are two options:
+         - Customize only one target:
+             - copy/paste the target into this file, *before* the
+               <import> task.
+             - customize it to your needs.
+         - Customize the whole content of build.xml
+             - copy/paste the content of the rules files (minus the top node)
+               into this file, replacing the <import> task.
+             - customize to your needs.
+
+         ***********************
+         ****** IMPORTANT ******
+         ***********************
+         In all cases you must update the value of version-tag below to read 'custom' instead of an integer,
+         in order to avoid having your file be overridden by tools such as "android update project"
+    -->
+    <!-- version-tag: 1 -->
+    <import file="${sdk.dir}/tools/ant/build.xml" />
+
+</project>
diff --git a/examples/androidapp/project.properties b/examples/androidapp/project.properties
new file mode 100644
index 0000000..a6ca533
--- /dev/null
+++ b/examples/androidapp/project.properties
@@ -0,0 +1,16 @@
+# This file is automatically generated by Android Tools.
+# Do not modify this file -- YOUR CHANGES WILL BE ERASED!
+#
+# This file must be checked into Version Control Systems.
+#
+# To customize properties used by the Ant build system edit
+# "ant.properties", and override values to adapt the script to your
+# project structure.
+#
+# To enable ProGuard to shrink and obfuscate your code, uncomment this (available properties: sdk.dir, user.home):
+#proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
+
+# Project target.
+target=android-22
+
+java.compilerargs=-Xlint:all -Werror
diff --git a/examples/androidapp/res/drawable-hdpi/disconnect.png b/examples/androidapp/res/drawable-hdpi/disconnect.png
new file mode 100644
index 0000000..be36174
--- /dev/null
+++ b/examples/androidapp/res/drawable-hdpi/disconnect.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-hdpi/ic_action_full_screen.png b/examples/androidapp/res/drawable-hdpi/ic_action_full_screen.png
new file mode 100644
index 0000000..22f30d3
--- /dev/null
+++ b/examples/androidapp/res/drawable-hdpi/ic_action_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-hdpi/ic_action_return_from_full_screen.png b/examples/androidapp/res/drawable-hdpi/ic_action_return_from_full_screen.png
new file mode 100644
index 0000000..d9436e5
--- /dev/null
+++ b/examples/androidapp/res/drawable-hdpi/ic_action_return_from_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-hdpi/ic_launcher.png b/examples/androidapp/res/drawable-hdpi/ic_launcher.png
new file mode 100644
index 0000000..f01a31a
--- /dev/null
+++ b/examples/androidapp/res/drawable-hdpi/ic_launcher.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-hdpi/ic_loopback_call.png b/examples/androidapp/res/drawable-hdpi/ic_loopback_call.png
new file mode 100644
index 0000000..3931185
--- /dev/null
+++ b/examples/androidapp/res/drawable-hdpi/ic_loopback_call.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-ldpi/disconnect.png b/examples/androidapp/res/drawable-ldpi/disconnect.png
new file mode 100644
index 0000000..be36174
--- /dev/null
+++ b/examples/androidapp/res/drawable-ldpi/disconnect.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-ldpi/ic_action_full_screen.png b/examples/androidapp/res/drawable-ldpi/ic_action_full_screen.png
new file mode 100644
index 0000000..e4a9ff0
--- /dev/null
+++ b/examples/androidapp/res/drawable-ldpi/ic_action_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-ldpi/ic_action_return_from_full_screen.png b/examples/androidapp/res/drawable-ldpi/ic_action_return_from_full_screen.png
new file mode 100644
index 0000000..f5c80f0
--- /dev/null
+++ b/examples/androidapp/res/drawable-ldpi/ic_action_return_from_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-ldpi/ic_launcher.png b/examples/androidapp/res/drawable-ldpi/ic_launcher.png
new file mode 100644
index 0000000..5492ed7
--- /dev/null
+++ b/examples/androidapp/res/drawable-ldpi/ic_launcher.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-ldpi/ic_loopback_call.png b/examples/androidapp/res/drawable-ldpi/ic_loopback_call.png
new file mode 100644
index 0000000..3931185
--- /dev/null
+++ b/examples/androidapp/res/drawable-ldpi/ic_loopback_call.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-mdpi/disconnect.png b/examples/androidapp/res/drawable-mdpi/disconnect.png
new file mode 100644
index 0000000..be36174
--- /dev/null
+++ b/examples/androidapp/res/drawable-mdpi/disconnect.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-mdpi/ic_action_full_screen.png b/examples/androidapp/res/drawable-mdpi/ic_action_full_screen.png
new file mode 100644
index 0000000..e4a9ff0
--- /dev/null
+++ b/examples/androidapp/res/drawable-mdpi/ic_action_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-mdpi/ic_action_return_from_full_screen.png b/examples/androidapp/res/drawable-mdpi/ic_action_return_from_full_screen.png
new file mode 100644
index 0000000..f5c80f0
--- /dev/null
+++ b/examples/androidapp/res/drawable-mdpi/ic_action_return_from_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-mdpi/ic_launcher.png b/examples/androidapp/res/drawable-mdpi/ic_launcher.png
new file mode 100644
index 0000000..b8b4b0e
--- /dev/null
+++ b/examples/androidapp/res/drawable-mdpi/ic_launcher.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-mdpi/ic_loopback_call.png b/examples/androidapp/res/drawable-mdpi/ic_loopback_call.png
new file mode 100644
index 0000000..3931185
--- /dev/null
+++ b/examples/androidapp/res/drawable-mdpi/ic_loopback_call.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-xhdpi/disconnect.png b/examples/androidapp/res/drawable-xhdpi/disconnect.png
new file mode 100644
index 0000000..be36174
--- /dev/null
+++ b/examples/androidapp/res/drawable-xhdpi/disconnect.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-xhdpi/ic_action_full_screen.png b/examples/androidapp/res/drawable-xhdpi/ic_action_full_screen.png
new file mode 100644
index 0000000..6d90c07
--- /dev/null
+++ b/examples/androidapp/res/drawable-xhdpi/ic_action_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-xhdpi/ic_action_return_from_full_screen.png b/examples/androidapp/res/drawable-xhdpi/ic_action_return_from_full_screen.png
new file mode 100644
index 0000000..a773b34
--- /dev/null
+++ b/examples/androidapp/res/drawable-xhdpi/ic_action_return_from_full_screen.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-xhdpi/ic_launcher.png b/examples/androidapp/res/drawable-xhdpi/ic_launcher.png
new file mode 100644
index 0000000..a3cd458
--- /dev/null
+++ b/examples/androidapp/res/drawable-xhdpi/ic_launcher.png
Binary files differ
diff --git a/examples/androidapp/res/drawable-xhdpi/ic_loopback_call.png b/examples/androidapp/res/drawable-xhdpi/ic_loopback_call.png
new file mode 100644
index 0000000..3931185
--- /dev/null
+++ b/examples/androidapp/res/drawable-xhdpi/ic_loopback_call.png
Binary files differ
diff --git a/examples/androidapp/res/layout/activity_call.xml b/examples/androidapp/res/layout/activity_call.xml
new file mode 100644
index 0000000..a18f758
--- /dev/null
+++ b/examples/androidapp/res/layout/activity_call.xml
@@ -0,0 +1,23 @@
+<?xml version="1.0" encoding="utf-8"?>
+
+<RelativeLayout
+        xmlns:android="http://schemas.android.com/apk/res/android"
+        xmlns:tools="http://schemas.android.com/tools"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent">
+
+    <android.opengl.GLSurfaceView
+        android:id="@+id/glview_call"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent" />
+
+    <FrameLayout
+        android:id="@+id/call_fragment_container"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent" />
+    <FrameLayout
+        android:id="@+id/hud_fragment_container"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent" />
+
+</RelativeLayout>
diff --git a/examples/androidapp/res/layout/activity_connect.xml b/examples/androidapp/res/layout/activity_connect.xml
new file mode 100644
index 0000000..5b80771
--- /dev/null
+++ b/examples/androidapp/res/layout/activity_connect.xml
@@ -0,0 +1,72 @@
+<?xml version="1.0" encoding="utf-8"?>
+<LinearLayout
+        xmlns:android="http://schemas.android.com/apk/res/android"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent"
+        android:orientation="vertical"
+        android:weightSum="1"
+        android:layout_margin="8dp"
+        android:layout_centerHorizontal="true">
+
+    <LinearLayout
+        android:layout_width="match_parent"
+        android:layout_height="wrap_content"
+        android:orientation="horizontal" >
+            <ImageButton
+                android:id="@+id/add_room_button"
+                android:background="@android:drawable/ic_menu_add"
+                android:contentDescription="@string/add_room_description"
+                android:layout_marginEnd="20dp"
+                android:layout_width="48dp"
+                android:layout_height="48dp"/>
+            <ImageButton
+                android:id="@+id/remove_room_button"
+                android:background="@android:drawable/ic_delete"
+                android:contentDescription="@string/remove_room_description"
+                android:layout_marginEnd="20dp"
+                android:layout_width="48dp"
+                android:layout_height="48dp"/>
+            <ImageButton
+                android:id="@+id/connect_button"
+                android:background="@android:drawable/sym_action_call"
+                android:contentDescription="@string/connect_description"
+                android:layout_marginEnd="20dp"
+                android:layout_width="48dp"
+                android:layout_height="48dp"/>
+            <ImageButton
+                android:id="@+id/connect_loopback_button"
+                android:background="@drawable/ic_loopback_call"
+                android:contentDescription="@string/connect_loopback_description"
+                android:layout_width="48dp"
+                android:layout_height="48dp"/>
+    </LinearLayout>
+    <TextView
+            android:id="@+id/room_edittext_description"
+            android:layout_width="fill_parent"
+            android:layout_height="wrap_content"
+            android:layout_margin="5dp"
+            android:text="@string/room_description"/>
+    <EditText
+            android:id="@+id/room_edittext"
+            android:layout_width="match_parent"
+            android:layout_height="wrap_content"
+            android:singleLine="true"
+            android:imeOptions="actionDone"/>
+    <TextView
+            android:id="@+id/room_listview_description"
+            android:layout_width="fill_parent"
+            android:layout_height="wrap_content"
+            android:layout_marginTop="5dp"
+            android:lines="1"
+            android:maxLines="1"
+            android:textAppearance="?android:attr/textAppearanceMedium"
+            android:text="@string/room_names"/>
+    <ListView
+            android:id="@+id/room_listview"
+            android:layout_width="fill_parent"
+            android:layout_height="wrap_content"
+            android:choiceMode="singleChoice"
+            android:listSelector="@android:color/darker_gray"
+            android:drawSelectorOnTop="false" />
+
+</LinearLayout>
\ No newline at end of file
diff --git a/examples/androidapp/res/layout/fragment_call.xml b/examples/androidapp/res/layout/fragment_call.xml
new file mode 100644
index 0000000..70a9a28
--- /dev/null
+++ b/examples/androidapp/res/layout/fragment_call.xml
@@ -0,0 +1,51 @@
+<?xml version="1.0" encoding="utf-8"?>
+
+<RelativeLayout
+        xmlns:android="http://schemas.android.com/apk/res/android"
+        xmlns:tools="http://schemas.android.com/tools"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent">
+
+    <TextView
+        android:id="@+id/contact_name_call"
+        android:layout_width="wrap_content"
+        android:layout_height="wrap_content"
+        android:layout_centerHorizontal="true"
+        android:layout_above="@+id/buttons_call_container"
+        android:textSize="24sp"
+        android:layout_margin="8dp"/>
+
+    <LinearLayout
+           android:id="@+id/buttons_call_container"
+           android:orientation="horizontal"
+           android:layout_alignParentBottom="true"
+           android:layout_marginBottom="32dp"
+           android:layout_centerHorizontal="true"
+           android:layout_width="wrap_content"
+           android:layout_height="wrap_content">
+
+       <ImageButton
+           android:id="@+id/button_call_disconnect"
+           android:background="@drawable/disconnect"
+           android:contentDescription="@string/disconnect_call"
+           android:layout_marginEnd="16dp"
+           android:layout_width="48dp"
+           android:layout_height="48dp"/>
+
+       <ImageButton
+           android:id="@+id/button_call_switch_camera"
+           android:background="@android:drawable/ic_menu_camera"
+           android:contentDescription="@string/switch_camera"
+           android:layout_marginEnd="8dp"
+           android:layout_width="48dp"
+           android:layout_height="48dp"/>
+
+        <ImageButton
+           android:id="@+id/button_call_scaling_mode"
+           android:background="@drawable/ic_action_return_from_full_screen"
+           android:contentDescription="@string/disconnect_call"
+           android:layout_width="48dp"
+           android:layout_height="48dp"/>
+    </LinearLayout>
+
+</RelativeLayout>
diff --git a/examples/androidapp/res/layout/fragment_hud.xml b/examples/androidapp/res/layout/fragment_hud.xml
new file mode 100644
index 0000000..273a2f3
--- /dev/null
+++ b/examples/androidapp/res/layout/fragment_hud.xml
@@ -0,0 +1,75 @@
+<?xml version="1.0" encoding="utf-8"?>
+
+<RelativeLayout
+        xmlns:android="http://schemas.android.com/apk/res/android"
+        xmlns:tools="http://schemas.android.com/tools"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent">
+
+    <ImageButton
+        android:id="@+id/button_toggle_debug"
+        android:background="@android:drawable/ic_menu_info_details"
+        android:contentDescription="@string/toggle_debug"
+        android:layout_alignParentBottom="true"
+        android:layout_alignParentStart="true"
+        android:layout_width="48dp"
+        android:layout_height="48dp"/>
+
+    <TextView
+        android:id="@+id/encoder_stat_call"
+        android:layout_width="wrap_content"
+        android:layout_height="wrap_content"
+        android:layout_alignParentEnd="true"
+        android:textStyle="bold"
+        android:textColor="#C000FF00"
+        android:textSize="12dp"
+        android:layout_margin="8dp"/>
+
+    <TableLayout
+           android:id="@+id/hudview_container"
+           android:layout_width="match_parent"
+           android:layout_height="match_parent">
+
+           <TableRow>
+              <TextView
+                 android:id="@+id/hud_stat_bwe"
+                 android:layout_width="wrap_content"
+                 android:layout_height="wrap_content"
+                 android:alpha="0.4"
+                 android:padding="2dip"
+                 android:background="@android:color/white"
+                 android:textColor="@android:color/black" />
+
+              <TextView
+                 android:id="@+id/hud_stat_connection"
+                 android:layout_width="wrap_content"
+                 android:layout_height="wrap_content"
+                 android:alpha="0.4"
+                 android:padding="2dip"
+                 android:background="@android:color/white"
+                 android:textColor="@android:color/black" />
+
+           </TableRow>
+
+           <TableRow>
+              <TextView
+                 android:id="@+id/hud_stat_video_send"
+                 android:layout_width="wrap_content"
+                 android:layout_height="wrap_content"
+                 android:alpha="0.4"
+                 android:padding="2dip"
+                 android:background="@android:color/white"
+                 android:textColor="@android:color/black" />
+
+              <TextView
+                 android:id="@+id/hud_stat_video_recv"
+                 android:layout_width="wrap_content"
+                 android:layout_height="wrap_content"
+                 android:padding="2dip"
+                 android:alpha="0.4"
+                 android:background="@android:color/white"
+                 android:textColor="@android:color/black" />
+            </TableRow>
+    </TableLayout>
+
+</RelativeLayout>
diff --git a/examples/androidapp/res/menu/connect_menu.xml b/examples/androidapp/res/menu/connect_menu.xml
new file mode 100644
index 0000000..d9f9486
--- /dev/null
+++ b/examples/androidapp/res/menu/connect_menu.xml
@@ -0,0 +1,8 @@
+<menu xmlns:android="http://schemas.android.com/apk/res/android" >
+    <item
+        android:id="@+id/action_settings"
+        android:orderInCategory="100"
+        android:icon="@android:drawable/ic_menu_preferences"
+        android:showAsAction="ifRoom"
+        android:title="@string/action_settings"/>
+</menu>
diff --git a/examples/androidapp/res/values-v17/styles.xml b/examples/androidapp/res/values-v17/styles.xml
new file mode 100644
index 0000000..1d72850
--- /dev/null
+++ b/examples/androidapp/res/values-v17/styles.xml
@@ -0,0 +1,8 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+  <style name="CallActivityTheme" parent="android:Theme.Black">
+    <item name="android:windowActionBar">false</item>
+    <item name="android:windowFullscreen">true</item>
+    <item name="android:windowNoTitle">true</item>
+  </style>
+</resources>
diff --git a/examples/androidapp/res/values-v21/styles.xml b/examples/androidapp/res/values-v21/styles.xml
new file mode 100644
index 0000000..95f1ac6
--- /dev/null
+++ b/examples/androidapp/res/values-v21/styles.xml
@@ -0,0 +1,8 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+  <style name="AppRTCDemoActivityTheme" parent="android:Theme.Material">
+    <item name="android:windowActionBar">false</item>
+    <item name="android:windowFullscreen">true</item>
+    <item name="android:windowNoTitle">true</item>
+  </style>
+</resources>
diff --git a/examples/androidapp/res/values/arrays.xml b/examples/androidapp/res/values/arrays.xml
new file mode 100644
index 0000000..ba8c891
--- /dev/null
+++ b/examples/androidapp/res/values/arrays.xml
@@ -0,0 +1,39 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <string-array name="videoResolutions">
+        <item>Default</item>
+        <item>HD (1280 x 720)</item>
+        <item>VGA (640 x 480)</item>
+        <item>QVGA (320 x 240)</item>
+    </string-array>
+
+    <string-array name="videoResolutionsValues">
+        <item>Default</item>
+        <item>1280 x 720</item>
+        <item>640 x 480</item>
+        <item>320 x 240</item>
+    </string-array>
+
+    <string-array name="cameraFps">
+        <item>Default</item>
+        <item>30 fps</item>
+        <item>15 fps</item>
+    </string-array>
+
+    <string-array name="startBitrate">
+        <item>Default</item>
+        <item>Manual</item>
+    </string-array>
+
+    <string-array name="videoCodecs">
+        <item>VP8</item>
+        <item>VP9</item>
+        <item>H264</item>
+    </string-array>
+
+    <string-array name="audioCodecs">
+        <item>OPUS</item>
+        <item>ISAC</item>
+    </string-array>
+
+</resources>
diff --git a/examples/androidapp/res/values/strings.xml b/examples/androidapp/res/values/strings.xml
new file mode 100644
index 0000000..4f2f377
--- /dev/null
+++ b/examples/androidapp/res/values/strings.xml
@@ -0,0 +1,113 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <string name="app_name" translatable="no">AppRTC</string>
+    <string name="settings_name" translatable="no">AppRTC Settings</string>
+    <string name="disconnect_call">Disconnect Call</string>
+    <string name="room_names">Room names:</string>
+    <string name="room_description">
+        Please enter a room name. Room names are shared with everyone, so think
+        of something unique and send it to a friend.
+    </string>
+    <string name="connect_text">Connect</string>
+    <string name="invalid_url_title">Invalid URL</string>
+    <string name="invalid_url_text">The URL or room name you entered resulted in an invalid URL: %1$s
+    </string>
+    <string name="channel_error_title">Connection error</string>
+    <string name="connecting_to">Connecting to: %1$s</string>
+    <string name="missing_url">FATAL ERROR: Missing URL to connect to.</string>
+    <string name="ok">OK</string>
+    <string name="switch_camera">Switch front/back camera</string>
+    <string name="toggle_debug">Toggle debug view</string>
+    <string name="action_settings">Settings</string>
+    <string name="add_room_description">Add new room to the list</string>
+    <string name="remove_room_description">Remove room from the list</string>
+    <string name="connect_description">Connect to the room</string>
+    <string name="connect_loopback_description">Loopback connection</string>
+
+    <!-- Settings strings. -->
+    <string name="pref_room_key">room_preference</string>
+    <string name="pref_room_list_key">room_list_preference</string>
+
+    <string name="pref_videosettings_key">video_settings_key</string>
+    <string name="pref_videosettings_title">WebRTC video settings.</string>
+
+    <string name="pref_videocall_key">videocall_preference</string>
+    <string name="pref_videocall_title">Video call.</string>
+    <string name="pref_videocall_dlg">Enable video in a call.</string>
+    <string name="pref_videocall_default">true</string>
+
+    <string name="pref_resolution_key">resolution_preference</string>
+    <string name="pref_resolution_title">Video resolution.</string>
+    <string name="pref_resolution_dlg">Enter AppRTC local video resolution.</string>
+    <string name="pref_resolution_default">Default</string>
+
+    <string name="pref_fps_key">fps_preference</string>
+    <string name="pref_fps_title">Camera fps.</string>
+    <string name="pref_fps_dlg">Enter local camera fps.</string>
+    <string name="pref_fps_default">Default</string>
+
+    <string name="pref_startvideobitrate_key">startvideobitrate_preference</string>
+    <string name="pref_startvideobitrate_title">Start video bitrate setting.</string>
+    <string name="pref_startvideobitrate_dlg">Start video bitrate setting.</string>
+    <string name="pref_startvideobitrate_default">Default</string>
+
+    <string name="pref_startvideobitratevalue_key">startvideobitratevalue_preference</string>
+    <string name="pref_startvideobitratevalue_title">Video encoder start bitrate.</string>
+    <string name="pref_startvideobitratevalue_dlg">Enter video encoder start bitrate in kbps.</string>
+    <string name="pref_startvideobitratevalue_default">1000</string>
+
+    <string name="pref_videocodec_key">videocodec_preference</string>
+    <string name="pref_videocodec_title">Default video codec.</string>
+    <string name="pref_videocodec_dlg">Select default video codec.</string>
+    <string name="pref_videocodec_default">VP8</string>
+
+    <string name="pref_hwcodec_key">hwcodec_preference</string>
+    <string name="pref_hwcodec_title">Video codec hardware acceleration.</string>
+    <string name="pref_hwcodec_dlg">Use hardware accelerated video codec (if available).</string>
+    <string name="pref_hwcodec_default">true</string>
+
+    <string name="pref_value_enabled">Enabled</string>
+    <string name="pref_value_disabled">Disabled</string>
+
+    <string name="pref_audiosettings_key">audio_settings_key</string>
+    <string name="pref_audiosettings_title">WebRTC audio settings.</string>
+
+    <string name="pref_startaudiobitrate_key">startaudiobitrate_preference</string>
+    <string name="pref_startaudiobitrate_title">Audio bitrate setting.</string>
+    <string name="pref_startaudiobitrate_dlg">Audio bitrate setting.</string>
+    <string name="pref_startaudiobitrate_default">Default</string>
+
+    <string name="pref_startaudiobitratevalue_key">startaudiobitratevalue_preference</string>
+    <string name="pref_startaudiobitratevalue_title">Audio codec bitrate.</string>
+    <string name="pref_startaudiobitratevalue_dlg">Enter audio codec bitrate in kbps.</string>
+    <string name="pref_startaudiobitratevalue_default">32</string>
+
+    <string name="pref_audiocodec_key">audiocodec_preference</string>
+    <string name="pref_audiocodec_title">Default audio codec.</string>
+    <string name="pref_audiocodec_dlg">Select default audio codec.</string>
+    <string name="pref_audiocodec_default">OPUS</string>
+
+    <string name="pref_noaudioprocessing_key">audioprocessing_preference</string>
+    <string name="pref_noaudioprocessing_title">Disable audio processing.</string>
+    <string name="pref_noaudioprocessing_dlg">Disable audio processing pipeline.</string>
+    <string name="pref_noaudioprocessing_default">false</string>
+
+    <string name="pref_miscsettings_key">misc_settings_key</string>
+    <string name="pref_miscsettings_title">Miscellaneous settings.</string>
+
+    <string name="pref_cpu_usage_detection_key">cpu_usage_detection</string>
+    <string name="pref_cpu_usage_detection_title">CPU overuse detection.</string>
+    <string name="pref_cpu_usage_detection_dlg">Adapt transmission to CPU status.</string>
+    <string name="pref_cpu_usage_detection_default" translatable="false">true</string>
+
+    <string name="pref_room_server_url_key">room_server_url_preference</string>
+    <string name="pref_room_server_url_title">Room server URL.</string>
+    <string name="pref_room_server_url_dlg">Enter a room server URL.</string>
+    <string name="pref_room_server_url_default" translatable="false">https://apprtc.appspot.com</string>
+
+    <string name="pref_displayhud_key">displayhud_preference</string>
+    <string name="pref_displayhud_title">Display call statistics.</string>
+    <string name="pref_displayhud_dlg">Display call statistics.</string>
+    <string name="pref_displayhud_default" translatable="false">false</string>
+
+</resources>
diff --git a/examples/androidapp/res/xml/preferences.xml b/examples/androidapp/res/xml/preferences.xml
new file mode 100644
index 0000000..73d8d5e
--- /dev/null
+++ b/examples/androidapp/res/xml/preferences.xml
@@ -0,0 +1,117 @@
+<?xml version="1.0" encoding="utf-8"?>
+<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android">
+    <PreferenceCategory
+        android:key="@string/pref_videosettings_key"
+        android:title="@string/pref_videosettings_title">
+
+        <CheckBoxPreference
+            android:key="@string/pref_videocall_key"
+            android:title="@string/pref_videocall_title"
+            android:dialogTitle="@string/pref_videocall_dlg"
+            android:defaultValue="@string/pref_videocall_default" />
+
+        <ListPreference
+            android:key="@string/pref_resolution_key"
+            android:title="@string/pref_resolution_title"
+            android:defaultValue="@string/pref_resolution_default"
+            android:dialogTitle="@string/pref_resolution_dlg"
+            android:entries="@array/videoResolutions"
+            android:entryValues="@array/videoResolutionsValues" />
+
+        <ListPreference
+            android:key="@string/pref_fps_key"
+            android:title="@string/pref_fps_title"
+            android:defaultValue="@string/pref_fps_default"
+            android:dialogTitle="@string/pref_fps_dlg"
+            android:entries="@array/cameraFps"
+            android:entryValues="@array/cameraFps" />
+
+        <ListPreference
+            android:key="@string/pref_startvideobitrate_key"
+            android:title="@string/pref_startvideobitrate_title"
+            android:defaultValue="@string/pref_startvideobitrate_default"
+            android:dialogTitle="@string/pref_startvideobitrate_dlg"
+            android:entries="@array/startBitrate"
+            android:entryValues="@array/startBitrate" />
+
+        <EditTextPreference
+            android:key="@string/pref_startvideobitratevalue_key"
+            android:title="@string/pref_startvideobitratevalue_title"
+            android:inputType="number"
+            android:defaultValue="@string/pref_startvideobitratevalue_default"
+            android:dialogTitle="@string/pref_startvideobitratevalue_dlg" />
+
+        <ListPreference
+            android:key="@string/pref_videocodec_key"
+            android:title="@string/pref_videocodec_title"
+            android:defaultValue="@string/pref_videocodec_default"
+            android:dialogTitle="@string/pref_videocodec_dlg"
+            android:entries="@array/videoCodecs"
+            android:entryValues="@array/videoCodecs" />
+
+        <CheckBoxPreference
+            android:key="@string/pref_hwcodec_key"
+            android:title="@string/pref_hwcodec_title"
+            android:dialogTitle="@string/pref_hwcodec_dlg"
+            android:defaultValue="@string/pref_hwcodec_default" />
+    </PreferenceCategory>
+
+    <PreferenceCategory
+        android:key="@string/pref_audiosettings_key"
+        android:title="@string/pref_audiosettings_title">
+
+        <ListPreference
+            android:key="@string/pref_startaudiobitrate_key"
+            android:title="@string/pref_startaudiobitrate_title"
+            android:defaultValue="@string/pref_startaudiobitrate_default"
+            android:dialogTitle="@string/pref_startaudiobitrate_dlg"
+            android:entries="@array/startBitrate"
+            android:entryValues="@array/startBitrate" />
+
+        <EditTextPreference
+            android:key="@string/pref_startaudiobitratevalue_key"
+            android:title="@string/pref_startaudiobitratevalue_title"
+            android:inputType="number"
+            android:defaultValue="@string/pref_startaudiobitratevalue_default"
+            android:dialogTitle="@string/pref_startaudiobitratevalue_dlg" />
+
+        <ListPreference
+            android:key="@string/pref_audiocodec_key"
+            android:title="@string/pref_audiocodec_title"
+            android:defaultValue="@string/pref_audiocodec_default"
+            android:dialogTitle="@string/pref_audiocodec_dlg"
+            android:entries="@array/audioCodecs"
+            android:entryValues="@array/audioCodecs" />
+
+        <CheckBoxPreference
+            android:key="@string/pref_noaudioprocessing_key"
+            android:title="@string/pref_noaudioprocessing_title"
+            android:dialogTitle="@string/pref_noaudioprocessing_dlg"
+            android:defaultValue="@string/pref_noaudioprocessing_default" />
+    </PreferenceCategory>
+
+    <PreferenceCategory
+        android:key="@string/pref_miscsettings_key"
+        android:title="@string/pref_miscsettings_title">
+
+        <CheckBoxPreference
+            android:key="@string/pref_cpu_usage_detection_key"
+            android:title="@string/pref_cpu_usage_detection_title"
+            android:dialogTitle="@string/pref_cpu_usage_detection_dlg"
+            android:defaultValue="@string/pref_cpu_usage_detection_default" />
+
+        <EditTextPreference
+            android:key="@string/pref_room_server_url_key"
+            android:title="@string/pref_room_server_url_title"
+            android:inputType="text"
+            android:defaultValue="@string/pref_room_server_url_default"
+            android:dialogTitle="@string/pref_room_server_url_dlg" />
+
+        <CheckBoxPreference
+            android:key="@string/pref_displayhud_key"
+            android:title="@string/pref_displayhud_title"
+            android:dialogTitle="@string/pref_displayhud_dlg"
+            android:defaultValue="@string/pref_displayhud_default" />
+    </PreferenceCategory>
+
+</PreferenceScreen>
diff --git a/examples/androidapp/src/org/appspot/apprtc/AppRTCAudioManager.java b/examples/androidapp/src/org/appspot/apprtc/AppRTCAudioManager.java
new file mode 100644
index 0000000..961c512
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/AppRTCAudioManager.java
@@ -0,0 +1,364 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.util.AppRTCUtils;
+
+import android.content.BroadcastReceiver;
+import android.content.Context;
+import android.content.Intent;
+import android.content.IntentFilter;
+import android.content.pm.PackageManager;
+import android.media.AudioManager;
+import android.util.Log;
+
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.Set;
+
+/**
+ * AppRTCAudioManager manages all audio related parts of the AppRTC demo.
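+ *
+ * <p>A minimal usage sketch (assumed, based on the methods in this class):
+ * <pre>
+ * AppRTCAudioManager audioManager = AppRTCAudioManager.create(context, null);
+ * audioManager.init();   // Requests audio focus and enters MODE_IN_COMMUNICATION.
+ * // ... run the call ...
+ * audioManager.close();  // Restores the previously stored audio state.
+ * </pre>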
+ */
+public class AppRTCAudioManager {
+  private static final String TAG = "AppRTCAudioManager";
+
+  /**
+   * AudioDevice enumerates the possible audio devices that we currently
+   * support.
+   */
+  // TODO(henrika): add support for BLUETOOTH as well.
+  public enum AudioDevice {
+    SPEAKER_PHONE,
+    WIRED_HEADSET,
+    EARPIECE,
+  }
+
+  private final Context apprtcContext;
+  private final Runnable onStateChangeListener;
+  private boolean initialized = false;
+  private AudioManager audioManager;
+  private int savedAudioMode = AudioManager.MODE_INVALID;
+  private boolean savedIsSpeakerPhoneOn = false;
+  private boolean savedIsMicrophoneMute = false;
+
+  // For now, always use the speaker phone as the default device selection
+  // when there is a choice between SPEAKER_PHONE and EARPIECE.
+  // TODO(henrika): it is possible that EARPIECE should be preferred in some
+  // cases. If so, we should set this value at construction instead.
+  private final AudioDevice defaultAudioDevice = AudioDevice.SPEAKER_PHONE;
+
+  // Proximity sensor object. It measures the proximity of an object in cm
+  // relative to the view screen of a device and can therefore be used to
+  // assist device switching (close to ear <=> use headset earpiece if
+  // available, far from ear <=> use speaker phone).
+  private AppRTCProximitySensor proximitySensor = null;
+
+  // Contains the currently selected audio device.
+  private AudioDevice selectedAudioDevice;
+
+  // Contains a list of available audio devices. A Set collection is used to
+  // avoid duplicate elements.
+  private final Set<AudioDevice> audioDevices = new HashSet<AudioDevice>();
+
+  // Broadcast receiver for wired headset intent broadcasts.
+  private BroadcastReceiver wiredHeadsetReceiver;
+
+  // This method is called when the proximity sensor reports a state change,
+  // e.g. from "NEAR to FAR" or from "FAR to NEAR".
+  private void onProximitySensorChangedState() {
+    // The proximity sensor should only be activated when there are exactly two
+    // available audio devices.
+    if (audioDevices.size() == 2
+        && audioDevices.contains(AppRTCAudioManager.AudioDevice.EARPIECE)
+        && audioDevices.contains(
+            AppRTCAudioManager.AudioDevice.SPEAKER_PHONE)) {
+      if (proximitySensor.sensorReportsNearState()) {
+        // Sensor reports that a "handset is being held up to a person's ear",
+        // or "something is covering the light sensor".
+        setAudioDevice(AppRTCAudioManager.AudioDevice.EARPIECE);
+      } else {
+        // Sensor reports that a "handset is removed from a person's ear", or
+        // "the light sensor is no longer covered".
+        setAudioDevice(AppRTCAudioManager.AudioDevice.SPEAKER_PHONE);
+      }
+    }
+  }
+
+  /** Construction */
+  static AppRTCAudioManager create(Context context,
+      Runnable deviceStateChangeListener) {
+    return new AppRTCAudioManager(context, deviceStateChangeListener);
+  }
+
+  private AppRTCAudioManager(Context context,
+      Runnable deviceStateChangeListener) {
+    apprtcContext = context;
+    onStateChangeListener = deviceStateChangeListener;
+    audioManager = ((AudioManager) context.getSystemService(
+        Context.AUDIO_SERVICE));
+
+    // Create and initialize the proximity sensor.
+    // Tablet devices (e.g. Nexus 7) do not support proximity sensors.
+    // Note that the sensor will not be active until start() has been called.
+    proximitySensor = AppRTCProximitySensor.create(context, new Runnable() {
+      // This method will be called each time a state change is detected.
+      // Example: user holds his hand over the device (closer than ~5 cm),
+      // or removes his hand from the device.
+      public void run() {
+        onProximitySensorChangedState();
+      }
+    });
+    AppRTCUtils.logDeviceInfo(TAG);
+  }
+
+  public void init() {
+    Log.d(TAG, "init");
+    if (initialized) {
+      return;
+    }
+
+    // Store current audio state so we can restore it when close() is called.
+    savedAudioMode = audioManager.getMode();
+    savedIsSpeakerPhoneOn = audioManager.isSpeakerphoneOn();
+    savedIsMicrophoneMute = audioManager.isMicrophoneMute();
+
+    // Request audio focus before making any device switch.
+    audioManager.requestAudioFocus(null, AudioManager.STREAM_VOICE_CALL,
+        AudioManager.AUDIOFOCUS_GAIN_TRANSIENT);
+
+    // Start by setting MODE_IN_COMMUNICATION as default audio mode. It is
+    // required to be in this mode when playout and/or recording starts for
+    // best possible VoIP performance.
+    // TODO(henrika): we might want to start with RINGTONE mode here instead.
+    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
+
+    // Always disable microphone mute during a WebRTC call.
+    setMicrophoneMute(false);
+
+    // Do initial selection of audio device. This setting can later be changed
+    // either by adding/removing a wired headset or by covering/uncovering the
+    // proximity sensor.
+    updateAudioDeviceState(hasWiredHeadset());
+
+    // Register receiver for broadcast intents related to adding/removing a
+    // wired headset (Intent.ACTION_HEADSET_PLUG).
+    registerForWiredHeadsetIntentBroadcast();
+
+    initialized = true;
+  }
+
+  public void close() {
+    Log.d(TAG, "close");
+    if (!initialized) {
+      return;
+    }
+
+    unregisterForWiredHeadsetIntentBroadcast();
+
+    // Restore previously stored audio states.
+    setSpeakerphoneOn(savedIsSpeakerPhoneOn);
+    setMicrophoneMute(savedIsMicrophoneMute);
+    audioManager.setMode(savedAudioMode);
+    audioManager.abandonAudioFocus(null);
+
+    if (proximitySensor != null) {
+      proximitySensor.stop();
+      proximitySensor = null;
+    }
+
+    initialized = false;
+  }
+
+  /** Changes selection of the currently active audio device. */
+  public void setAudioDevice(AudioDevice device) {
+    Log.d(TAG, "setAudioDevice(device=" + device + ")");
+    AppRTCUtils.assertIsTrue(audioDevices.contains(device));
+
+    switch (device) {
+      case SPEAKER_PHONE:
+        setSpeakerphoneOn(true);
+        selectedAudioDevice = AudioDevice.SPEAKER_PHONE;
+        break;
+      case EARPIECE:
+        setSpeakerphoneOn(false);
+        selectedAudioDevice = AudioDevice.EARPIECE;
+        break;
+      case WIRED_HEADSET:
+        setSpeakerphoneOn(false);
+        selectedAudioDevice = AudioDevice.WIRED_HEADSET;
+        break;
+      default:
+        Log.e(TAG, "Invalid audio device selection");
+        break;
+    }
+    onAudioManagerChangedState();
+  }
+
+  /** Returns current set of available/selectable audio devices. */
+  public Set<AudioDevice> getAudioDevices() {
+    return Collections.unmodifiableSet(new HashSet<AudioDevice>(audioDevices));
+  }
+
+  /** Returns the currently selected audio device. */
+  public AudioDevice getSelectedAudioDevice() {
+    return selectedAudioDevice;
+  }
+
+  /**
+   * Registers receiver for the broadcasted intent when a wired headset is
+   * plugged in or unplugged. The received intent will have an extra
+   * 'state' value where 0 means unplugged, and 1 means plugged.
+   */
+  private void registerForWiredHeadsetIntentBroadcast() {
+    IntentFilter filter = new IntentFilter(Intent.ACTION_HEADSET_PLUG);
+
+    /** Receiver which handles changes in wired headset availability. */
+    wiredHeadsetReceiver = new BroadcastReceiver() {
+      private static final int STATE_UNPLUGGED = 0;
+      private static final int STATE_PLUGGED = 1;
+      private static final int HAS_NO_MIC = 0;
+      private static final int HAS_MIC = 1;
+
+      @Override
+      public void onReceive(Context context, Intent intent) {
+        int state = intent.getIntExtra("state", STATE_UNPLUGGED);
+        int microphone = intent.getIntExtra("microphone", HAS_NO_MIC);
+        String name = intent.getStringExtra("name");
+        Log.d(TAG, "BroadcastReceiver.onReceive" + AppRTCUtils.getThreadInfo()
+            + ": "
+            + "a=" + intent.getAction()
+            + ", s=" + (state == STATE_UNPLUGGED ? "unplugged" : "plugged")
+            + ", m=" + (microphone == HAS_MIC ? "mic" : "no mic")
+            + ", n=" + name
+            + ", sb=" + isInitialStickyBroadcast());
+
+        boolean hasWiredHeadset = (state == STATE_PLUGGED);
+        switch (state) {
+          case STATE_UNPLUGGED:
+            updateAudioDeviceState(hasWiredHeadset);
+            break;
+          case STATE_PLUGGED:
+            if (selectedAudioDevice != AudioDevice.WIRED_HEADSET) {
+              updateAudioDeviceState(hasWiredHeadset);
+            }
+            break;
+          default:
+            Log.e(TAG, "Invalid state");
+            break;
+        }
+      }
+    };
+
+    apprtcContext.registerReceiver(wiredHeadsetReceiver, filter);
+  }
+
+  /** Unregister receiver for broadcasted ACTION_HEADSET_PLUG intent. */
+  private void unregisterForWiredHeadsetIntentBroadcast() {
+    apprtcContext.unregisterReceiver(wiredHeadsetReceiver);
+    wiredHeadsetReceiver = null;
+  }
+
+  /** Sets the speaker phone mode. */
+  private void setSpeakerphoneOn(boolean on) {
+    boolean wasOn = audioManager.isSpeakerphoneOn();
+    if (wasOn == on) {
+      return;
+    }
+    audioManager.setSpeakerphoneOn(on);
+  }
+
+  /** Sets the microphone mute state. */
+  private void setMicrophoneMute(boolean on) {
+    boolean wasMuted = audioManager.isMicrophoneMute();
+    if (wasMuted == on) {
+      return;
+    }
+    audioManager.setMicrophoneMute(on);
+  }
+
+  /** Gets the current earpiece state. */
+  private boolean hasEarpiece() {
+    return apprtcContext.getPackageManager().hasSystemFeature(
+        PackageManager.FEATURE_TELEPHONY);
+  }
+
+  /**
+   * Checks whether a wired headset is connected or not.
+   * This is not a valid indication that audio playback is actually over
+   * the wired headset as audio routing depends on other conditions. We
+   * only use it as an early indicator (during initialization) of an attached
+   * wired headset.
+   */
+  @Deprecated
+  private boolean hasWiredHeadset() {
+    return audioManager.isWiredHeadsetOn();
+  }
+
+  /** Update list of possible audio devices and make new device selection. */
+  private void updateAudioDeviceState(boolean hasWiredHeadset) {
+    // Update the list of available audio devices.
+    audioDevices.clear();
+    if (hasWiredHeadset) {
+      // If a wired headset is connected, then it is the only possible option.
+      audioDevices.add(AudioDevice.WIRED_HEADSET);
+    } else {
+      // No wired headset, hence the audio-device list can contain speaker
+      // phone (on a tablet), or speaker phone and earpiece (on mobile phone).
+      audioDevices.add(AudioDevice.SPEAKER_PHONE);
+      if (hasEarpiece()) {
+        audioDevices.add(AudioDevice.EARPIECE);
+      }
+    }
+    Log.d(TAG, "audioDevices: " + audioDevices);
+
+    // Switch to correct audio device given the list of available audio devices.
+    if (hasWiredHeadset) {
+      setAudioDevice(AudioDevice.WIRED_HEADSET);
+    } else {
+      setAudioDevice(defaultAudioDevice);
+    }
+  }
+
+  /** Called each time a new audio device has been added or removed. */
+  private void onAudioManagerChangedState() {
+    Log.d(TAG, "onAudioManagerChangedState: devices=" + audioDevices
+        + ", selected=" + selectedAudioDevice);
+
+    // Enable the proximity sensor if there are two available audio devices
+    // in the list. Given the current implementation, we know that the choice
+    // will then be between EARPIECE and SPEAKER_PHONE.
+    if (audioDevices.size() == 2) {
+      AppRTCUtils.assertIsTrue(audioDevices.contains(AudioDevice.EARPIECE)
+          && audioDevices.contains(AudioDevice.SPEAKER_PHONE));
+      // Start the proximity sensor.
+      proximitySensor.start();
+    } else if (audioDevices.size() == 1) {
+      // Stop the proximity sensor since it is no longer needed.
+      proximitySensor.stop();
+    } else {
+      Log.e(TAG, "Invalid device list");
+    }
+
+    if (onStateChangeListener != null) {
+      // Run callback to notify a listening client. The client can then
+      // use public getters to query the new state.
+      onStateChangeListener.run();
+    }
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/AppRTCClient.java b/examples/androidapp/src/org/appspot/apprtc/AppRTCClient.java
new file mode 100644
index 0000000..195446a
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/AppRTCClient.java
@@ -0,0 +1,131 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.webrtc.IceCandidate;
+import org.webrtc.PeerConnection;
+import org.webrtc.SessionDescription;
+
+import java.util.List;
+
+/**
+ * AppRTCClient is the interface representing an AppRTC client.
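+ *
+ * <p>A rough usage sketch (assumed, inferred from the methods declared below):
+ * implement SignalingEvents to receive room and remote-peer messages, call
+ * connectToRoom() with a RoomConnectionParameters instance, exchange SDP and
+ * ICE via sendOfferSdp(), sendAnswerSdp() and sendLocalIceCandidate(), and
+ * call disconnectFromRoom() when the call ends.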
+ */
+public interface AppRTCClient {
+
+  /**
+   * Struct holding the connection parameters of an AppRTC room.
+   */
+  public static class RoomConnectionParameters {
+    public final String roomUrl;
+    public final String roomId;
+    public final boolean loopback;
+    public RoomConnectionParameters(
+        String roomUrl, String roomId, boolean loopback) {
+      this.roomUrl = roomUrl;
+      this.roomId = roomId;
+      this.loopback = loopback;
+    }
+  }
+
+  /**
+   * Asynchronously connect to an AppRTC room URL using supplied connection
+   * parameters. Once the connection is established, the onConnectedToRoom()
+   * callback is invoked with the room parameters.
+   */
+  public void connectToRoom(RoomConnectionParameters connectionParameters);
+
+  /**
+   * Send offer SDP to the other participant.
+   */
+  public void sendOfferSdp(final SessionDescription sdp);
+
+  /**
+   * Send answer SDP to the other participant.
+   */
+  public void sendAnswerSdp(final SessionDescription sdp);
+
+  /**
+   * Send Ice candidate to the other participant.
+   */
+  public void sendLocalIceCandidate(final IceCandidate candidate);
+
+  /**
+   * Disconnect from room.
+   */
+  public void disconnectFromRoom();
+
+  /**
+   * Struct holding the signaling parameters of an AppRTC room.
+   */
+  public static class SignalingParameters {
+    public final List<PeerConnection.IceServer> iceServers;
+    public final boolean initiator;
+    public final String clientId;
+    public final String wssUrl;
+    public final String wssPostUrl;
+    public final SessionDescription offerSdp;
+    public final List<IceCandidate> iceCandidates;
+
+    public SignalingParameters(
+        List<PeerConnection.IceServer> iceServers,
+        boolean initiator, String clientId,
+        String wssUrl, String wssPostUrl,
+        SessionDescription offerSdp, List<IceCandidate> iceCandidates) {
+      this.iceServers = iceServers;
+      this.initiator = initiator;
+      this.clientId = clientId;
+      this.wssUrl = wssUrl;
+      this.wssPostUrl = wssPostUrl;
+      this.offerSdp = offerSdp;
+      this.iceCandidates = iceCandidates;
+    }
+  }
+
+  /**
+   * Callback interface for messages delivered on the signaling channel.
+   *
+   * <p>Methods are guaranteed to be invoked on the UI thread of |activity|.
+   */
+  public static interface SignalingEvents {
+    /**
+     * Callback fired once the room's signaling parameters
+     * (SignalingParameters) are extracted.
+     */
+    public void onConnectedToRoom(final SignalingParameters params);
+
+    /**
+     * Callback fired once remote SDP is received.
+     */
+    public void onRemoteDescription(final SessionDescription sdp);
+
+    /**
+     * Callback fired once remote Ice candidate is received.
+     */
+    public void onRemoteIceCandidate(final IceCandidate candidate);
+
+    /**
+     * Callback fired once channel is closed.
+     */
+    public void onChannelClose();
+
+    /**
+     * Callback fired once a channel error has occurred.
+     */
+    public void onChannelError(final String description);
+  }
+}
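
For orientation, a hypothetical sketch of how an activity drives this interface. It follows the CallActivity code later in this change; the WebSocketRTCClient constructor arguments are inferred from that usage (CallActivity passes itself, which implements SignalingEvents), so treat them as assumptions.

package org.appspot.apprtc;

import org.appspot.apprtc.util.LooperExecutor;

/** Hypothetical snippet showing the AppRTCClient call sequence. */
final class AppRTCClientUsageSketch {
  static AppRTCClient connect(
      AppRTCClient.SignalingEvents events, String roomUrl, String roomId) {
    // CallActivity passes itself as the first argument (it implements
    // SignalingEvents) together with a LooperExecutor.
    AppRTCClient client = new WebSocketRTCClient(events, new LooperExecutor());
    AppRTCClient.RoomConnectionParameters params =
        new AppRTCClient.RoomConnectionParameters(roomUrl, roomId, false /* loopback */);
    // Asynchronous; SignalingEvents.onConnectedToRoom() is invoked on the
    // UI thread once the room connection is established.
    client.connectToRoom(params);
    return client;
  }
}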
diff --git a/examples/androidapp/src/org/appspot/apprtc/AppRTCProximitySensor.java b/examples/androidapp/src/org/appspot/apprtc/AppRTCProximitySensor.java
new file mode 100644
index 0000000..08d9691
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/AppRTCProximitySensor.java
@@ -0,0 +1,180 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.util.AppRTCUtils;
+import org.appspot.apprtc.util.AppRTCUtils.NonThreadSafe;
+
+import android.content.Context;
+import android.hardware.Sensor;
+import android.hardware.SensorEvent;
+import android.hardware.SensorEventListener;
+import android.hardware.SensorManager;
+import android.os.Build;
+import android.util.Log;
+
+/**
+ * AppRTCProximitySensor manages functions related to the proximity sensor in
+ * the AppRTC demo.
+ * On most devices, the proximity sensor is implemented as a boolean sensor.
+ * It returns just two values, "NEAR" or "FAR". Thresholding is done on the LUX
+ * value, i.e. the LUX value of the light sensor is compared with a threshold.
+ * A LUX value above the threshold means the proximity sensor returns "FAR";
+ * anything below the threshold means the sensor returns "NEAR".
+ */
+public class AppRTCProximitySensor implements SensorEventListener {
+  private static final String TAG = "AppRTCProximitySensor";
+
+  // This class should be created, started and stopped on one thread
+  // (e.g. the main thread). We use |nonThreadSafe| to ensure that this is
+  // the case. Only active when |DEBUG| is set to true.
+  private final NonThreadSafe nonThreadSafe = new AppRTCUtils.NonThreadSafe();
+
+  private final Runnable onSensorStateListener;
+  private final SensorManager sensorManager;
+  private Sensor proximitySensor = null;
+  private boolean lastStateReportIsNear = false;
+
+  /** Construction */
+  static AppRTCProximitySensor create(Context context,
+      Runnable sensorStateListener) {
+    return new AppRTCProximitySensor(context, sensorStateListener);
+  }
+
+  private AppRTCProximitySensor(Context context, Runnable sensorStateListener) {
+    Log.d(TAG, "AppRTCProximitySensor" + AppRTCUtils.getThreadInfo());
+    onSensorStateListener = sensorStateListener;
+    sensorManager = ((SensorManager) context.getSystemService(
+        Context.SENSOR_SERVICE));
+  }
+
+  /**
+   * Activate the proximity sensor. Also do initialization if called for the
+   * first time.
+   */
+  public boolean start() {
+    checkIfCalledOnValidThread();
+    Log.d(TAG, "start" + AppRTCUtils.getThreadInfo());
+    if (!initDefaultSensor()) {
+      // Proximity sensor is not supported on this device.
+      return false;
+    }
+    sensorManager.registerListener(
+        this, proximitySensor, SensorManager.SENSOR_DELAY_NORMAL);
+    return true;
+  }
+
+  /** Deactivate the proximity sensor. */
+  public void stop() {
+    checkIfCalledOnValidThread();
+    Log.d(TAG, "stop" + AppRTCUtils.getThreadInfo());
+    if (proximitySensor == null) {
+      return;
+    }
+    sensorManager.unregisterListener(this, proximitySensor);
+  }
+
+  /** Getter for last reported state. Set to true if "near" is reported. */
+  public boolean sensorReportsNearState() {
+    checkIfCalledOnValidThread();
+    return lastStateReportIsNear;
+  }
+
+  @Override
+  public final void onAccuracyChanged(Sensor sensor, int accuracy) {
+    checkIfCalledOnValidThread();
+    AppRTCUtils.assertIsTrue(sensor.getType() == Sensor.TYPE_PROXIMITY);
+    if (accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
+      Log.e(TAG, "The values returned by this sensor cannot be trusted");
+    }
+  }
+
+  @Override
+  public final void onSensorChanged(SensorEvent event) {
+    checkIfCalledOnValidThread();
+    AppRTCUtils.assertIsTrue(event.sensor.getType() == Sensor.TYPE_PROXIMITY);
+    // As a best practice, do as little as possible within this method and
+    // avoid blocking.
+    float distanceInCentimeters = event.values[0];
+    if (distanceInCentimeters < proximitySensor.getMaximumRange()) {
+      Log.d(TAG, "Proximity sensor => NEAR state");
+      lastStateReportIsNear = true;
+    } else {
+      Log.d(TAG, "Proximity sensor => FAR state");
+      lastStateReportIsNear = false;
+    }
+
+    // Report about new state to listening client. Client can then call
+    // sensorReportsNearState() to query the current state (NEAR or FAR).
+    if (onSensorStateListener != null) {
+      onSensorStateListener.run();
+    }
+
+    Log.d(TAG, "onSensorChanged" + AppRTCUtils.getThreadInfo() + ": "
+        + "accuracy=" + event.accuracy
+        + ", timestamp=" + event.timestamp + ", distance=" + event.values[0]);
+  }
+
+  /**
+   * Get default proximity sensor if it exists. Tablet devices (e.g. Nexus 7)
+   * do not support this type of sensor, and false will be returned in such
+   * cases.
+   */
+  private boolean initDefaultSensor() {
+    if (proximitySensor != null) {
+      return true;
+    }
+    proximitySensor = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
+    if (proximitySensor == null) {
+      return false;
+    }
+    logProximitySensorInfo();
+    return true;
+  }
+
+  /** Helper method for logging information about the proximity sensor. */
+  private void logProximitySensorInfo() {
+    if (proximitySensor == null) {
+      return;
+    }
+    StringBuilder info = new StringBuilder("Proximity sensor: ");
+    info.append("name=" + proximitySensor.getName());
+    info.append(", vendor: " + proximitySensor.getVendor());
+    info.append(", power: " + proximitySensor.getPower());
+    info.append(", resolution: " + proximitySensor.getResolution());
+    info.append(", max range: " + proximitySensor.getMaximumRange());
+    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
+      // Added in API level 9.
+      info.append(", min delay: " + proximitySensor.getMinDelay());
+    }
+    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT_WATCH) {
+      // Added in API level 20.
+      info.append(", type: " + proximitySensor.getStringType());
+    }
+    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
+      // Added in API level 21.
+      info.append(", max delay: " + proximitySensor.getMaxDelay());
+      info.append(", reporting mode: " + proximitySensor.getReportingMode());
+      info.append(", isWakeUpSensor: " + proximitySensor.isWakeUpSensor());
+    }
+    Log.d(TAG, info.toString());
+  }
+
+  /**
+   * Helper method for debugging purposes. Ensures that method is
+   * called on same thread as this object was created on.
+   */
+  private void checkIfCalledOnValidThread() {
+    if (!nonThreadSafe.calledOnValidThread()) {
+      throw new IllegalStateException("Method is not called on valid thread");
+    }
+  }
+}
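
For orientation, a hypothetical lifecycle sketch. It mirrors how AppRTCAudioManager above creates, starts and stops this sensor; the wrapper class name and log tag are illustrative only.

package org.appspot.apprtc;

import android.content.Context;
import android.util.Log;

/** Hypothetical snippet showing the AppRTCProximitySensor lifecycle. */
final class ProximitySensorUsageSketch {
  private AppRTCProximitySensor sensor;

  void start(Context context) {
    // create(), start(), stop() and the state getter must all be used from a
    // single thread (see checkIfCalledOnValidThread()). The Runnable is run
    // on each sensor event; query the latest state inside it.
    sensor = AppRTCProximitySensor.create(context, new Runnable() {
      @Override
      public void run() {
        Log.d("Sketch", "near=" + sensor.sensorReportsNearState());
      }
    });
    // start() returns false if the device has no proximity sensor.
    if (!sensor.start()) {
      Log.w("Sketch", "Proximity sensor is not supported on this device.");
    }
  }

  void stop() {
    if (sensor != null) {
      sensor.stop();
    }
  }
}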
diff --git a/examples/androidapp/src/org/appspot/apprtc/CallActivity.java b/examples/androidapp/src/org/appspot/apprtc/CallActivity.java
new file mode 100644
index 0000000..d46f9a3
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/CallActivity.java
@@ -0,0 +1,647 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.AppRTCClient.RoomConnectionParameters;
+import org.appspot.apprtc.AppRTCClient.SignalingParameters;
+import org.appspot.apprtc.PeerConnectionClient.PeerConnectionParameters;
+import org.appspot.apprtc.util.LooperExecutor;
+
+import android.app.Activity;
+import android.app.AlertDialog;
+import android.app.FragmentTransaction;
+import android.content.DialogInterface;
+import android.content.Intent;
+import android.content.pm.PackageManager;
+import android.net.Uri;
+import android.opengl.GLSurfaceView;
+import android.os.Bundle;
+import android.util.Log;
+import android.view.View;
+import android.view.Window;
+import android.view.WindowManager.LayoutParams;
+import android.widget.Toast;
+
+import org.webrtc.IceCandidate;
+import org.webrtc.SessionDescription;
+import org.webrtc.StatsReport;
+import org.webrtc.VideoRenderer;
+import org.webrtc.VideoRendererGui;
+import org.webrtc.VideoRendererGui.ScalingType;
+
+/**
+ * Activity for peer connection call setup, call waiting
+ * and call view.
+ */
+public class CallActivity extends Activity
+    implements AppRTCClient.SignalingEvents,
+      PeerConnectionClient.PeerConnectionEvents,
+      CallFragment.OnCallEvents {
+
+  public static final String EXTRA_ROOMID =
+      "org.appspot.apprtc.ROOMID";
+  public static final String EXTRA_LOOPBACK =
+      "org.appspot.apprtc.LOOPBACK";
+  public static final String EXTRA_VIDEO_CALL =
+      "org.appspot.apprtc.VIDEO_CALL";
+  public static final String EXTRA_VIDEO_WIDTH =
+      "org.appspot.apprtc.VIDEO_WIDTH";
+  public static final String EXTRA_VIDEO_HEIGHT =
+      "org.appspot.apprtc.VIDEO_HEIGHT";
+  public static final String EXTRA_VIDEO_FPS =
+      "org.appspot.apprtc.VIDEO_FPS";
+  public static final String EXTRA_VIDEO_BITRATE =
+      "org.appspot.apprtc.VIDEO_BITRATE";
+  public static final String EXTRA_VIDEOCODEC =
+      "org.appspot.apprtc.VIDEOCODEC";
+  public static final String EXTRA_HWCODEC_ENABLED =
+      "org.appspot.apprtc.HWCODEC";
+  public static final String EXTRA_AUDIO_BITRATE =
+      "org.appspot.apprtc.AUDIO_BITRATE";
+  public static final String EXTRA_AUDIOCODEC =
+      "org.appspot.apprtc.AUDIOCODEC";
+  public static final String EXTRA_NOAUDIOPROCESSING_ENABLED =
+      "org.appspot.apprtc.NOAUDIOPROCESSING";
+  public static final String EXTRA_CPUOVERUSE_DETECTION =
+      "org.appspot.apprtc.CPUOVERUSE_DETECTION";
+  public static final String EXTRA_DISPLAY_HUD =
+      "org.appspot.apprtc.DISPLAY_HUD";
+  public static final String EXTRA_CMDLINE =
+      "org.appspot.apprtc.CMDLINE";
+  public static final String EXTRA_RUNTIME =
+      "org.appspot.apprtc.RUNTIME";
+  private static final String TAG = "CallRTCClient";
+
+  // List of mandatory application permissions.
+  private static final String[] MANDATORY_PERMISSIONS = {
+    "android.permission.MODIFY_AUDIO_SETTINGS",
+    "android.permission.RECORD_AUDIO",
+    "android.permission.INTERNET"
+  };
+
+  // Peer connection statistics callback period in ms.
+  private static final int STAT_CALLBACK_PERIOD = 1000;
+  // Local preview screen position before call is connected.
+  private static final int LOCAL_X_CONNECTING = 0;
+  private static final int LOCAL_Y_CONNECTING = 0;
+  private static final int LOCAL_WIDTH_CONNECTING = 100;
+  private static final int LOCAL_HEIGHT_CONNECTING = 100;
+  // Local preview screen position after call is connected.
+  private static final int LOCAL_X_CONNECTED = 72;
+  private static final int LOCAL_Y_CONNECTED = 72;
+  private static final int LOCAL_WIDTH_CONNECTED = 25;
+  private static final int LOCAL_HEIGHT_CONNECTED = 25;
+  // Remote video screen position
+  private static final int REMOTE_X = 0;
+  private static final int REMOTE_Y = 0;
+  private static final int REMOTE_WIDTH = 100;
+  private static final int REMOTE_HEIGHT = 100;
+
+  private PeerConnectionClient peerConnectionClient = null;
+  private AppRTCClient appRtcClient;
+  private SignalingParameters signalingParameters;
+  private AppRTCAudioManager audioManager = null;
+  private VideoRenderer.Callbacks localRender;
+  private VideoRenderer.Callbacks remoteRender;
+  private ScalingType scalingType;
+  private Toast logToast;
+  private boolean commandLineRun;
+  private int runTimeMs;
+  private boolean activityRunning;
+  private RoomConnectionParameters roomConnectionParameters;
+  private PeerConnectionParameters peerConnectionParameters;
+  private boolean iceConnected;
+  private boolean isError;
+  private boolean callControlFragmentVisible = true;
+  private long callStartedTimeMs = 0;
+
+  // Controls
+  private GLSurfaceView videoView;
+  CallFragment callFragment;
+  HudFragment hudFragment;
+
+  @Override
+  public void onCreate(Bundle savedInstanceState) {
+    super.onCreate(savedInstanceState);
+    Thread.setDefaultUncaughtExceptionHandler(
+        new UnhandledExceptionHandler(this));
+
+    // Set window styles for fullscreen-window size. Needs to be done before
+    // adding content.
+    requestWindowFeature(Window.FEATURE_NO_TITLE);
+    getWindow().addFlags(
+        LayoutParams.FLAG_FULLSCREEN
+        | LayoutParams.FLAG_KEEP_SCREEN_ON
+        | LayoutParams.FLAG_DISMISS_KEYGUARD
+        | LayoutParams.FLAG_SHOW_WHEN_LOCKED
+        | LayoutParams.FLAG_TURN_SCREEN_ON);
+    getWindow().getDecorView().setSystemUiVisibility(
+        View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
+        | View.SYSTEM_UI_FLAG_FULLSCREEN
+        | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);
+    setContentView(R.layout.activity_call);
+
+    iceConnected = false;
+    signalingParameters = null;
+    scalingType = ScalingType.SCALE_ASPECT_FILL;
+
+    // Create UI controls.
+    videoView = (GLSurfaceView) findViewById(R.id.glview_call);
+    callFragment = new CallFragment();
+    hudFragment = new HudFragment();
+
+    // Create video renderers.
+    VideoRendererGui.setView(videoView, new Runnable() {
+      @Override
+      public void run() {
+        createPeerConnectionFactory();
+      }
+    });
+    remoteRender = VideoRendererGui.create(
+        REMOTE_X, REMOTE_Y,
+        REMOTE_WIDTH, REMOTE_HEIGHT, scalingType, false);
+    localRender = VideoRendererGui.create(
+        LOCAL_X_CONNECTING, LOCAL_Y_CONNECTING,
+        LOCAL_WIDTH_CONNECTING, LOCAL_HEIGHT_CONNECTING, scalingType, true);
+
+    // Show/hide call control fragment on view click.
+    videoView.setOnClickListener(new View.OnClickListener() {
+      @Override
+      public void onClick(View view) {
+        toggleCallControlFragmentVisibility();
+      }
+    });
+
+    // Check for mandatory permissions.
+    for (String permission : MANDATORY_PERMISSIONS) {
+      if (checkCallingOrSelfPermission(permission) != PackageManager.PERMISSION_GRANTED) {
+        logAndToast("Permission " + permission + " is not granted");
+        setResult(RESULT_CANCELED);
+        finish();
+        return;
+      }
+    }
+
+    // Get Intent parameters.
+    final Intent intent = getIntent();
+    Uri roomUri = intent.getData();
+    if (roomUri == null) {
+      logAndToast(getString(R.string.missing_url));
+      Log.e(TAG, "Didn't get any URL in intent!");
+      setResult(RESULT_CANCELED);
+      finish();
+      return;
+    }
+    String roomId = intent.getStringExtra(EXTRA_ROOMID);
+    if (roomId == null || roomId.length() == 0) {
+      logAndToast(getString(R.string.missing_url));
+      Log.e(TAG, "Incorrect room ID in intent!");
+      setResult(RESULT_CANCELED);
+      finish();
+      return;
+    }
+    boolean loopback = intent.getBooleanExtra(EXTRA_LOOPBACK, false);
+    peerConnectionParameters = new PeerConnectionParameters(
+        intent.getBooleanExtra(EXTRA_VIDEO_CALL, true),
+        loopback,
+        intent.getIntExtra(EXTRA_VIDEO_WIDTH, 0),
+        intent.getIntExtra(EXTRA_VIDEO_HEIGHT, 0),
+        intent.getIntExtra(EXTRA_VIDEO_FPS, 0),
+        intent.getIntExtra(EXTRA_VIDEO_BITRATE, 0),
+        intent.getStringExtra(EXTRA_VIDEOCODEC),
+        intent.getBooleanExtra(EXTRA_HWCODEC_ENABLED, true),
+        intent.getIntExtra(EXTRA_AUDIO_BITRATE, 0),
+        intent.getStringExtra(EXTRA_AUDIOCODEC),
+        intent.getBooleanExtra(EXTRA_NOAUDIOPROCESSING_ENABLED, false),
+        intent.getBooleanExtra(EXTRA_CPUOVERUSE_DETECTION, true));
+    commandLineRun = intent.getBooleanExtra(EXTRA_CMDLINE, false);
+    runTimeMs = intent.getIntExtra(EXTRA_RUNTIME, 0);
+
+    // Create connection client and connection parameters.
+    appRtcClient = new WebSocketRTCClient(this, new LooperExecutor());
+    roomConnectionParameters = new RoomConnectionParameters(
+        roomUri.toString(), roomId, loopback);
+
+    // Send intent arguments to fragments.
+    callFragment.setArguments(intent.getExtras());
+    hudFragment.setArguments(intent.getExtras());
+    // Activate call and HUD fragments and start the call.
+    FragmentTransaction ft = getFragmentManager().beginTransaction();
+    ft.add(R.id.call_fragment_container, callFragment);
+    ft.add(R.id.hud_fragment_container, hudFragment);
+    ft.commit();
+    startCall();
+
+    // For command-line execution, run the connection for <runTimeMs> and exit.
+    if (commandLineRun && runTimeMs > 0) {
+      videoView.postDelayed(new Runnable() {
+        public void run() {
+          disconnect();
+        }
+      }, runTimeMs);
+    }
+  }
+
+  // Activity interfaces
+  @Override
+  public void onPause() {
+    super.onPause();
+    videoView.onPause();
+    activityRunning = false;
+    if (peerConnectionClient != null) {
+      peerConnectionClient.stopVideoSource();
+    }
+  }
+
+  @Override
+  public void onResume() {
+    super.onResume();
+    videoView.onResume();
+    activityRunning = true;
+    if (peerConnectionClient != null) {
+      peerConnectionClient.startVideoSource();
+    }
+  }
+
+  @Override
+  protected void onDestroy() {
+    disconnect();
+    super.onDestroy();
+    if (logToast != null) {
+      logToast.cancel();
+    }
+    activityRunning = false;
+  }
+
+  // CallFragment.OnCallEvents interface implementation.
+  @Override
+  public void onCallHangUp() {
+    disconnect();
+  }
+
+  @Override
+  public void onCameraSwitch() {
+    if (peerConnectionClient != null) {
+      peerConnectionClient.switchCamera();
+    }
+  }
+
+  @Override
+  public void onVideoScalingSwitch(ScalingType scalingType) {
+    this.scalingType = scalingType;
+    updateVideoView();
+  }
+
+  // Helper functions.
+  private void toggleCallControlFragmentVisibility() {
+    if (!iceConnected || !callFragment.isAdded()) {
+      return;
+    }
+    // Show/hide call control fragment
+    callControlFragmentVisible = !callControlFragmentVisible;
+    FragmentTransaction ft = getFragmentManager().beginTransaction();
+    if (callControlFragmentVisible) {
+      ft.show(callFragment);
+      ft.show(hudFragment);
+    } else {
+      ft.hide(callFragment);
+      ft.hide(hudFragment);
+    }
+    ft.setTransition(FragmentTransaction.TRANSIT_FRAGMENT_FADE);
+    ft.commit();
+  }
+
+  private void updateVideoView() {
+    VideoRendererGui.update(remoteRender,
+        REMOTE_X, REMOTE_Y,
+        REMOTE_WIDTH, REMOTE_HEIGHT, scalingType, false);
+    if (iceConnected) {
+      VideoRendererGui.update(localRender,
+          LOCAL_X_CONNECTED, LOCAL_Y_CONNECTED,
+          LOCAL_WIDTH_CONNECTED, LOCAL_HEIGHT_CONNECTED,
+          ScalingType.SCALE_ASPECT_FIT, true);
+    } else {
+      VideoRendererGui.update(localRender,
+          LOCAL_X_CONNECTING, LOCAL_Y_CONNECTING,
+          LOCAL_WIDTH_CONNECTING, LOCAL_HEIGHT_CONNECTING, scalingType, true);
+    }
+  }
+
+  private void startCall() {
+    if (appRtcClient == null) {
+      Log.e(TAG, "AppRTC client is not allocated for a call.");
+      return;
+    }
+    callStartedTimeMs = System.currentTimeMillis();
+
+    // Start room connection.
+    logAndToast(getString(R.string.connecting_to,
+        roomConnectionParameters.roomUrl));
+    appRtcClient.connectToRoom(roomConnectionParameters);
+
+    // Create an audio manager that will take care of audio routing,
+    // audio modes, audio device enumeration etc.
+    audioManager = AppRTCAudioManager.create(this, new Runnable() {
+        // This method will be called each time the audio state (number and
+        // type of devices) has been changed.
+        @Override
+        public void run() {
+          onAudioManagerChangedState();
+        }
+      }
+    );
+    // Store existing audio settings and change audio mode to
+    // MODE_IN_COMMUNICATION for best possible VoIP performance.
+    Log.d(TAG, "Initializing the audio manager...");
+    audioManager.init();
+  }
+
+  // Should be called from UI thread
+  private void callConnected() {
+    final long delta = System.currentTimeMillis() - callStartedTimeMs;
+    Log.i(TAG, "Call connected: delay=" + delta + "ms");
+
+    // Update video view.
+    updateVideoView();
+    // Enable statistics callback.
+    peerConnectionClient.enableStatsEvents(true, STAT_CALLBACK_PERIOD);
+  }
+
+  private void onAudioManagerChangedState() {
+    // TODO(henrika): disable video if AppRTCAudioManager.AudioDevice.EARPIECE
+    // is active.
+  }
+
+  // Create peer connection factory when EGL context is ready.
+  private void createPeerConnectionFactory() {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnectionClient == null) {
+          final long delta = System.currentTimeMillis() - callStartedTimeMs;
+          Log.d(TAG, "Creating peer connection factory, delay=" + delta + "ms");
+          peerConnectionClient = PeerConnectionClient.getInstance();
+          peerConnectionClient.createPeerConnectionFactory(CallActivity.this,
+              VideoRendererGui.getEGLContext(), peerConnectionParameters,
+              CallActivity.this);
+        }
+        if (signalingParameters != null) {
+          Log.w(TAG, "EGL context is ready after room connection.");
+          onConnectedToRoomInternal(signalingParameters);
+        }
+      }
+    });
+  }
+
+  // Disconnect from remote resources, dispose of local resources, and exit.
+  private void disconnect() {
+    activityRunning = false;
+    if (appRtcClient != null) {
+      appRtcClient.disconnectFromRoom();
+      appRtcClient = null;
+    }
+    if (peerConnectionClient != null) {
+      peerConnectionClient.close();
+      peerConnectionClient = null;
+    }
+    if (audioManager != null) {
+      audioManager.close();
+      audioManager = null;
+    }
+    if (iceConnected && !isError) {
+      setResult(RESULT_OK);
+    } else {
+      setResult(RESULT_CANCELED);
+    }
+    finish();
+  }
+
+  private void disconnectWithErrorMessage(final String errorMessage) {
+    if (commandLineRun || !activityRunning) {
+      Log.e(TAG, "Critical error: " + errorMessage);
+      disconnect();
+    } else {
+      new AlertDialog.Builder(this)
+          .setTitle(getText(R.string.channel_error_title))
+          .setMessage(errorMessage)
+          .setCancelable(false)
+          .setNeutralButton(R.string.ok, new DialogInterface.OnClickListener() {
+            @Override
+            public void onClick(DialogInterface dialog, int id) {
+              dialog.cancel();
+              disconnect();
+            }
+          }).create().show();
+    }
+  }
+
+  // Log |msg| and Toast about it.
+  private void logAndToast(String msg) {
+    Log.d(TAG, msg);
+    if (logToast != null) {
+      logToast.cancel();
+    }
+    logToast = Toast.makeText(this, msg, Toast.LENGTH_SHORT);
+    logToast.show();
+  }
+
+  private void reportError(final String description) {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (!isError) {
+          isError = true;
+          disconnectWithErrorMessage(description);
+        }
+      }
+    });
+  }
+
+  // -----Implementation of AppRTCClient.SignalingEvents ----------------------
+  // All callbacks are invoked from websocket signaling looper thread and
+  // are routed to UI thread.
+  private void onConnectedToRoomInternal(final SignalingParameters params) {
+    final long delta = System.currentTimeMillis() - callStartedTimeMs;
+
+    signalingParameters = params;
+    if (peerConnectionClient == null) {
+      Log.w(TAG, "Room is connected, but EGL context is not ready yet.");
+      return;
+    }
+    logAndToast("Creating peer connection, delay=" + delta + "ms");
+    peerConnectionClient.createPeerConnection(
+        localRender, remoteRender, signalingParameters);
+
+    if (signalingParameters.initiator) {
+      logAndToast("Creating OFFER...");
+      // Create offer. Offer SDP will be sent to answering client in
+      // PeerConnectionEvents.onLocalDescription event.
+      peerConnectionClient.createOffer();
+    } else {
+      if (params.offerSdp != null) {
+        peerConnectionClient.setRemoteDescription(params.offerSdp);
+        logAndToast("Creating ANSWER...");
+        // Create answer. Answer SDP will be sent to offering client in
+        // PeerConnectionEvents.onLocalDescription event.
+        peerConnectionClient.createAnswer();
+      }
+      if (params.iceCandidates != null) {
+        // Add remote ICE candidates from room.
+        for (IceCandidate iceCandidate : params.iceCandidates) {
+          peerConnectionClient.addRemoteIceCandidate(iceCandidate);
+        }
+      }
+    }
+  }
+
+  @Override
+  public void onConnectedToRoom(final SignalingParameters params) {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        onConnectedToRoomInternal(params);
+      }
+    });
+  }
+
+  @Override
+  public void onRemoteDescription(final SessionDescription sdp) {
+    final long delta = System.currentTimeMillis() - callStartedTimeMs;
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnectionClient == null) {
+          Log.e(TAG, "Received remote SDP for non-initilized peer connection.");
+          return;
+        }
+        logAndToast("Received remote " + sdp.type + ", delay=" + delta + "ms");
+        peerConnectionClient.setRemoteDescription(sdp);
+        if (!signalingParameters.initiator) {
+          logAndToast("Creating ANSWER...");
+          // Create answer. Answer SDP will be sent to offering client in
+          // PeerConnectionEvents.onLocalDescription event.
+          peerConnectionClient.createAnswer();
+        }
+      }
+    });
+  }
+
+  @Override
+  public void onRemoteIceCandidate(final IceCandidate candidate) {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnectionClient == null) {
+          Log.e(TAG,
+              "Received ICE candidate for non-initilized peer connection.");
+          return;
+        }
+        peerConnectionClient.addRemoteIceCandidate(candidate);
+      }
+    });
+  }
+
+  @Override
+  public void onChannelClose() {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        logAndToast("Remote end hung up; dropping PeerConnection");
+        disconnect();
+      }
+    });
+  }
+
+  @Override
+  public void onChannelError(final String description) {
+    reportError(description);
+  }
+
+  // -----Implementation of PeerConnectionClient.PeerConnectionEvents.---------
+  // Send local peer connection SDP and ICE candidates to remote party.
+  // All callbacks are invoked from peer connection client looper thread and
+  // are routed to UI thread.
+  @Override
+  public void onLocalDescription(final SessionDescription sdp) {
+    final long delta = System.currentTimeMillis() - callStartedTimeMs;
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (appRtcClient != null) {
+          logAndToast("Sending " + sdp.type + ", delay=" + delta + "ms");
+          if (signalingParameters.initiator) {
+            appRtcClient.sendOfferSdp(sdp);
+          } else {
+            appRtcClient.sendAnswerSdp(sdp);
+          }
+        }
+      }
+    });
+  }
+
+  @Override
+  public void onIceCandidate(final IceCandidate candidate) {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (appRtcClient != null) {
+          appRtcClient.sendLocalIceCandidate(candidate);
+        }
+      }
+    });
+  }
+
+  @Override
+  public void onIceConnected() {
+    final long delta = System.currentTimeMillis() - callStartedTimeMs;
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        logAndToast("ICE connected, delay=" + delta + "ms");
+        iceConnected = true;
+        callConnected();
+      }
+    });
+  }
+
+  @Override
+  public void onIceDisconnected() {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        logAndToast("ICE disconnected");
+        iceConnected = false;
+        disconnect();
+      }
+    });
+  }
+
+  @Override
+  public void onPeerConnectionClosed() {
+  }
+
+  @Override
+  public void onPeerConnectionStatsReady(final StatsReport[] reports) {
+    runOnUiThread(new Runnable() {
+      @Override
+      public void run() {
+        if (!isError && iceConnected) {
+          hudFragment.updateEncoderStatistics(reports);
+        }
+      }
+    });
+  }
+
+  @Override
+  public void onPeerConnectionError(final String description) {
+    reportError(description);
+  }
+}
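
For orientation, a hypothetical launch sketch that condenses what ConnectActivity below does in connectToRoom(); the wrapper class, method name and request code are illustrative only.

package org.appspot.apprtc;

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

/** Hypothetical snippet: starting a call the way ConnectActivity does. */
final class CallLauncherSketch {
  static void startCall(Activity activity, String roomUrl, String roomId) {
    Intent intent = new Intent(activity, CallActivity.class);
    intent.setData(Uri.parse(roomUrl));                    // Room server URL.
    intent.putExtra(CallActivity.EXTRA_ROOMID, roomId);    // Mandatory room id.
    intent.putExtra(CallActivity.EXTRA_VIDEO_CALL, true);  // false for audio-only.
    // ConnectActivity also forwards width/height/fps/bitrate/codec settings;
    // CallActivity falls back to 0 or its own defaults when they are absent.
    activity.startActivityForResult(intent, 1 /* request code */);
  }
}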
diff --git a/examples/androidapp/src/org/appspot/apprtc/CallFragment.java b/examples/androidapp/src/org/appspot/apprtc/CallFragment.java
new file mode 100644
index 0000000..3d445d4
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/CallFragment.java
@@ -0,0 +1,118 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.app.Activity;
+import android.app.Fragment;
+import android.os.Bundle;
+import android.view.LayoutInflater;
+import android.view.View;
+import android.view.ViewGroup;
+import android.widget.ImageButton;
+import android.widget.TextView;
+
+import org.webrtc.VideoRendererGui.ScalingType;
+
+/**
+ * Fragment for call control.
+ */
+public class CallFragment extends Fragment {
+  private View controlView;
+  private TextView contactView;
+  private ImageButton disconnectButton;
+  private ImageButton cameraSwitchButton;
+  private ImageButton videoScalingButton;
+  private OnCallEvents callEvents;
+  private ScalingType scalingType;
+  private boolean videoCallEnabled = true;
+
+  /**
+   * Call control interface for container activity.
+   */
+  public interface OnCallEvents {
+    public void onCallHangUp();
+    public void onCameraSwitch();
+    public void onVideoScalingSwitch(ScalingType scalingType);
+  }
+
+  @Override
+  public View onCreateView(LayoutInflater inflater, ViewGroup container,
+      Bundle savedInstanceState) {
+    controlView =
+        inflater.inflate(R.layout.fragment_call, container, false);
+
+    // Create UI controls.
+    contactView =
+        (TextView) controlView.findViewById(R.id.contact_name_call);
+    disconnectButton =
+        (ImageButton) controlView.findViewById(R.id.button_call_disconnect);
+    cameraSwitchButton =
+        (ImageButton) controlView.findViewById(R.id.button_call_switch_camera);
+    videoScalingButton =
+        (ImageButton) controlView.findViewById(R.id.button_call_scaling_mode);
+
+    // Add buttons click events.
+    disconnectButton.setOnClickListener(new View.OnClickListener() {
+      @Override
+      public void onClick(View view) {
+        callEvents.onCallHangUp();
+      }
+    });
+
+    cameraSwitchButton.setOnClickListener(new View.OnClickListener() {
+      @Override
+      public void onClick(View view) {
+        callEvents.onCameraSwitch();
+      }
+    });
+
+    videoScalingButton.setOnClickListener(new View.OnClickListener() {
+      @Override
+      public void onClick(View view) {
+        if (scalingType == ScalingType.SCALE_ASPECT_FILL) {
+          videoScalingButton.setBackgroundResource(
+              R.drawable.ic_action_full_screen);
+          scalingType = ScalingType.SCALE_ASPECT_FIT;
+        } else {
+          videoScalingButton.setBackgroundResource(
+              R.drawable.ic_action_return_from_full_screen);
+          scalingType = ScalingType.SCALE_ASPECT_FILL;
+        }
+        callEvents.onVideoScalingSwitch(scalingType);
+      }
+    });
+    scalingType = ScalingType.SCALE_ASPECT_FILL;
+
+    return controlView;
+  }
+
+  @Override
+  public void onStart() {
+    super.onStart();
+
+    Bundle args = getArguments();
+    if (args != null) {
+      String contactName = args.getString(CallActivity.EXTRA_ROOMID);
+      contactView.setText(contactName);
+      videoCallEnabled = args.getBoolean(CallActivity.EXTRA_VIDEO_CALL, true);
+    }
+    if (!videoCallEnabled) {
+      cameraSwitchButton.setVisibility(View.INVISIBLE);
+    }
+  }
+
+  @Override
+  public void onAttach(Activity activity) {
+    super.onAttach(activity);
+    callEvents = (OnCallEvents) activity;
+  }
+
+}
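
For orientation, a hypothetical host-activity sketch. CallActivity above is the real host; this only spells out the OnCallEvents contract enforced by onAttach(). The class name and container choice are illustrative.

package org.appspot.apprtc;

import android.app.Activity;
import android.app.FragmentTransaction;
import android.os.Bundle;

import org.webrtc.VideoRendererGui.ScalingType;

/** Hypothetical host; onAttach() casts the activity to OnCallEvents. */
public class CallFragmentHostSketch extends Activity
    implements CallFragment.OnCallEvents {
  @Override
  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    CallFragment callFragment = new CallFragment();
    // The fragment reads EXTRA_ROOMID and EXTRA_VIDEO_CALL from its arguments.
    callFragment.setArguments(getIntent().getExtras());
    FragmentTransaction ft = getFragmentManager().beginTransaction();
    ft.add(android.R.id.content, callFragment);
    ft.commit();
  }

  @Override
  public void onCallHangUp() {
    finish();
  }

  @Override
  public void onCameraSwitch() {}

  @Override
  public void onVideoScalingSwitch(ScalingType scalingType) {}
}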
diff --git a/examples/androidapp/src/org/appspot/apprtc/ConnectActivity.java b/examples/androidapp/src/org/appspot/apprtc/ConnectActivity.java
new file mode 100644
index 0000000..7c00790
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/ConnectActivity.java
@@ -0,0 +1,403 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.app.Activity;
+import android.app.AlertDialog;
+import android.content.DialogInterface;
+import android.content.Intent;
+import android.content.SharedPreferences;
+import android.net.Uri;
+import android.os.Bundle;
+import android.preference.PreferenceManager;
+import android.util.Log;
+import android.view.KeyEvent;
+import android.view.Menu;
+import android.view.MenuItem;
+import android.view.View;
+import android.view.View.OnClickListener;
+import android.view.inputmethod.EditorInfo;
+import android.webkit.URLUtil;
+import android.widget.AdapterView;
+import android.widget.ArrayAdapter;
+import android.widget.EditText;
+import android.widget.ImageButton;
+import android.widget.ListView;
+import android.widget.TextView;
+
+import org.json.JSONArray;
+import org.json.JSONException;
+
+import java.util.ArrayList;
+import java.util.Random;
+
+/**
+ * Handles the initial setup where the user selects which room to join.
+ */
+public class ConnectActivity extends Activity {
+  private static final String TAG = "ConnectActivity";
+  private static final int CONNECTION_REQUEST = 1;
+  private static boolean commandLineRun = false;
+
+  private ImageButton addRoomButton;
+  private ImageButton removeRoomButton;
+  private ImageButton connectButton;
+  private ImageButton connectLoopbackButton;
+  private EditText roomEditText;
+  private ListView roomListView;
+  private SharedPreferences sharedPref;
+  private String keyprefVideoCallEnabled;
+  private String keyprefResolution;
+  private String keyprefFps;
+  private String keyprefVideoBitrateType;
+  private String keyprefVideoBitrateValue;
+  private String keyprefVideoCodec;
+  private String keyprefAudioBitrateType;
+  private String keyprefAudioBitrateValue;
+  private String keyprefAudioCodec;
+  private String keyprefHwCodecAcceleration;
+  private String keyprefNoAudioProcessingPipeline;
+  private String keyprefCpuUsageDetection;
+  private String keyprefDisplayHud;
+  private String keyprefRoomServerUrl;
+  private String keyprefRoom;
+  private String keyprefRoomList;
+  private ArrayList<String> roomList;
+  private ArrayAdapter<String> adapter;
+
+  @Override
+  public void onCreate(Bundle savedInstanceState) {
+    super.onCreate(savedInstanceState);
+
+    // Get setting keys.
+    PreferenceManager.setDefaultValues(this, R.xml.preferences, false);
+    sharedPref = PreferenceManager.getDefaultSharedPreferences(this);
+    keyprefVideoCallEnabled = getString(R.string.pref_videocall_key);
+    keyprefResolution = getString(R.string.pref_resolution_key);
+    keyprefFps = getString(R.string.pref_fps_key);
+    keyprefVideoBitrateType = getString(R.string.pref_startvideobitrate_key);
+    keyprefVideoBitrateValue = getString(R.string.pref_startvideobitratevalue_key);
+    keyprefVideoCodec = getString(R.string.pref_videocodec_key);
+    keyprefHwCodecAcceleration = getString(R.string.pref_hwcodec_key);
+    keyprefAudioBitrateType = getString(R.string.pref_startaudiobitrate_key);
+    keyprefAudioBitrateValue = getString(R.string.pref_startaudiobitratevalue_key);
+    keyprefAudioCodec = getString(R.string.pref_audiocodec_key);
+    keyprefNoAudioProcessingPipeline = getString(R.string.pref_noaudioprocessing_key);
+    keyprefCpuUsageDetection = getString(R.string.pref_cpu_usage_detection_key);
+    keyprefDisplayHud = getString(R.string.pref_displayhud_key);
+    keyprefRoomServerUrl = getString(R.string.pref_room_server_url_key);
+    keyprefRoom = getString(R.string.pref_room_key);
+    keyprefRoomList = getString(R.string.pref_room_list_key);
+
+    setContentView(R.layout.activity_connect);
+
+    roomEditText = (EditText) findViewById(R.id.room_edittext);
+    roomEditText.setOnEditorActionListener(
+      new TextView.OnEditorActionListener() {
+        @Override
+        public boolean onEditorAction(
+            TextView textView, int i, KeyEvent keyEvent) {
+          if (i == EditorInfo.IME_ACTION_DONE) {
+            addRoomButton.performClick();
+            return true;
+          }
+          return false;
+        }
+    });
+    roomEditText.requestFocus();
+
+    roomListView = (ListView) findViewById(R.id.room_listview);
+    roomListView.setChoiceMode(ListView.CHOICE_MODE_SINGLE);
+
+    addRoomButton = (ImageButton) findViewById(R.id.add_room_button);
+    addRoomButton.setOnClickListener(addRoomListener);
+    removeRoomButton = (ImageButton) findViewById(R.id.remove_room_button);
+    removeRoomButton.setOnClickListener(removeRoomListener);
+    connectButton = (ImageButton) findViewById(R.id.connect_button);
+    connectButton.setOnClickListener(connectListener);
+    connectLoopbackButton =
+        (ImageButton) findViewById(R.id.connect_loopback_button);
+    connectLoopbackButton.setOnClickListener(connectListener);
+
+    // If an implicit VIEW intent is launching the app, go directly to that URL.
+    final Intent intent = getIntent();
+    if ("android.intent.action.VIEW".equals(intent.getAction())
+        && !commandLineRun) {
+      commandLineRun = true;
+      boolean loopback = intent.getBooleanExtra(
+          CallActivity.EXTRA_LOOPBACK, false);
+      int runTimeMs = intent.getIntExtra(
+          CallActivity.EXTRA_RUNTIME, 0);
+      String room = sharedPref.getString(keyprefRoom, "");
+      roomEditText.setText(room);
+      connectToRoom(loopback, runTimeMs);
+      return;
+    }
+  }
+
+  @Override
+  public boolean onCreateOptionsMenu(Menu menu) {
+    getMenuInflater().inflate(R.menu.connect_menu, menu);
+    return true;
+  }
+
+  @Override
+  public boolean onOptionsItemSelected(MenuItem item) {
+    // Handle presses on the action bar items.
+    if (item.getItemId() == R.id.action_settings) {
+      Intent intent = new Intent(this, SettingsActivity.class);
+      startActivity(intent);
+      return true;
+    } else {
+      return super.onOptionsItemSelected(item);
+    }
+  }
+
+  @Override
+  public void onPause() {
+    super.onPause();
+    String room = roomEditText.getText().toString();
+    String roomListJson = new JSONArray(roomList).toString();
+    SharedPreferences.Editor editor = sharedPref.edit();
+    editor.putString(keyprefRoom, room);
+    editor.putString(keyprefRoomList, roomListJson);
+    editor.commit();
+  }
+
+  @Override
+  public void onResume() {
+    super.onResume();
+    String room = sharedPref.getString(keyprefRoom, "");
+    roomEditText.setText(room);
+    roomList = new ArrayList<String>();
+    String roomListJson = sharedPref.getString(keyprefRoomList, null);
+    if (roomListJson != null) {
+      try {
+        JSONArray jsonArray = new JSONArray(roomListJson);
+        for (int i = 0; i < jsonArray.length(); i++) {
+          roomList.add(jsonArray.get(i).toString());
+        }
+      } catch (JSONException e) {
+        Log.e(TAG, "Failed to load room list: " + e.toString());
+      }
+    }
+    adapter = new ArrayAdapter<String>(
+        this, android.R.layout.simple_list_item_1, roomList);
+    roomListView.setAdapter(adapter);
+    if (adapter.getCount() > 0) {
+      roomListView.requestFocus();
+      roomListView.setItemChecked(0, true);
+    }
+  }
+
+  @Override
+  protected void onActivityResult(
+      int requestCode, int resultCode, Intent data) {
+    if (requestCode == CONNECTION_REQUEST && commandLineRun) {
+      Log.d(TAG, "Return: " + resultCode);
+      setResult(resultCode);
+      commandLineRun = false;
+      finish();
+    }
+  }
+
+  private final OnClickListener connectListener = new OnClickListener() {
+    @Override
+    public void onClick(View view) {
+      boolean loopback = false;
+      if (view.getId() == R.id.connect_loopback_button) {
+        loopback = true;
+      }
+      commandLineRun = false;
+      connectToRoom(loopback, 0);
+    }
+  };
+
+  private void connectToRoom(boolean loopback, int runTimeMs) {
+    // Get room name (random for loopback).
+    String roomId;
+    if (loopback) {
+      roomId = Integer.toString((new Random()).nextInt(100000000));
+    } else {
+      roomId = getSelectedItem();
+      if (roomId == null) {
+        roomId = roomEditText.getText().toString();
+      }
+    }
+
+    String roomUrl = sharedPref.getString(
+        keyprefRoomServerUrl,
+        getString(R.string.pref_room_server_url_default));
+
+    // Video call enabled flag.
+    boolean videoCallEnabled = sharedPref.getBoolean(keyprefVideoCallEnabled,
+        Boolean.valueOf(getString(R.string.pref_videocall_default)));
+
+    // Get default codecs.
+    String videoCodec = sharedPref.getString(keyprefVideoCodec,
+        getString(R.string.pref_videocodec_default));
+    String audioCodec = sharedPref.getString(keyprefAudioCodec,
+        getString(R.string.pref_audiocodec_default));
+
+    // Check HW codec flag.
+    boolean hwCodec = sharedPref.getBoolean(keyprefHwCodecAcceleration,
+        Boolean.valueOf(getString(R.string.pref_hwcodec_default)));
+
+    // Check Disable Audio Processing flag.
+    boolean noAudioProcessing = sharedPref.getBoolean(
+        keyprefNoAudioProcessingPipeline,
+        Boolean.valueOf(getString(R.string.pref_noaudioprocessing_default)));
+
+    // Get video resolution from settings.
+    int videoWidth = 0;
+    int videoHeight = 0;
+    String resolution = sharedPref.getString(keyprefResolution,
+        getString(R.string.pref_resolution_default));
+    String[] dimensions = resolution.split("[ x]+");
+    if (dimensions.length == 2) {
+      try {
+        videoWidth = Integer.parseInt(dimensions[0]);
+        videoHeight = Integer.parseInt(dimensions[1]);
+      } catch (NumberFormatException e) {
+        videoWidth = 0;
+        videoHeight = 0;
+        Log.e(TAG, "Wrong video resolution setting: " + resolution);
+      }
+    }
+
+    // Get camera fps from settings.
+    int cameraFps = 0;
+    String fps = sharedPref.getString(keyprefFps,
+        getString(R.string.pref_fps_default));
+    String[] fpsValues = fps.split("[ x]+");
+    if (fpsValues.length == 2) {
+      try {
+        cameraFps = Integer.parseInt(fpsValues[0]);
+      } catch (NumberFormatException e) {
+        Log.e(TAG, "Wrong camera fps setting: " + fps);
+      }
+    }
+
+    // Get video and audio start bitrate.
+    int videoStartBitrate = 0;
+    String bitrateTypeDefault = getString(
+        R.string.pref_startvideobitrate_default);
+    String bitrateType = sharedPref.getString(
+        keyprefVideoBitrateType, bitrateTypeDefault);
+    if (!bitrateType.equals(bitrateTypeDefault)) {
+      String bitrateValue = sharedPref.getString(keyprefVideoBitrateValue,
+          getString(R.string.pref_startvideobitratevalue_default));
+      videoStartBitrate = Integer.parseInt(bitrateValue);
+    }
+    int audioStartBitrate = 0;
+    bitrateTypeDefault = getString(R.string.pref_startaudiobitrate_default);
+    bitrateType = sharedPref.getString(
+        keyprefAudioBitrateType, bitrateTypeDefault);
+    if (!bitrateType.equals(bitrateTypeDefault)) {
+      String bitrateValue = sharedPref.getString(keyprefAudioBitrateValue,
+          getString(R.string.pref_startaudiobitratevalue_default));
+      audioStartBitrate = Integer.parseInt(bitrateValue);
+    }
+
+    // Test if CpuOveruseDetection should be disabled. By default it is on.
+    boolean cpuOveruseDetection = sharedPref.getBoolean(
+        keyprefCpuUsageDetection,
+        Boolean.valueOf(
+            getString(R.string.pref_cpu_usage_detection_default)));
+
+    // Check statistics display option.
+    boolean displayHud = sharedPref.getBoolean(keyprefDisplayHud,
+        Boolean.valueOf(getString(R.string.pref_displayhud_default)));
+
+    // Start AppRTCDemo activity.
+    Log.d(TAG, "Connecting to room " + roomId + " at URL " + roomUrl);
+    if (validateUrl(roomUrl)) {
+      Uri uri = Uri.parse(roomUrl);
+      Intent intent = new Intent(this, CallActivity.class);
+      intent.setData(uri);
+      intent.putExtra(CallActivity.EXTRA_ROOMID, roomId);
+      intent.putExtra(CallActivity.EXTRA_LOOPBACK, loopback);
+      intent.putExtra(CallActivity.EXTRA_VIDEO_CALL, videoCallEnabled);
+      intent.putExtra(CallActivity.EXTRA_VIDEO_WIDTH, videoWidth);
+      intent.putExtra(CallActivity.EXTRA_VIDEO_HEIGHT, videoHeight);
+      intent.putExtra(CallActivity.EXTRA_VIDEO_FPS, cameraFps);
+      intent.putExtra(CallActivity.EXTRA_VIDEO_BITRATE, videoStartBitrate);
+      intent.putExtra(CallActivity.EXTRA_VIDEOCODEC, videoCodec);
+      intent.putExtra(CallActivity.EXTRA_HWCODEC_ENABLED, hwCodec);
+      intent.putExtra(CallActivity.EXTRA_NOAUDIOPROCESSING_ENABLED,
+          noAudioProcessing);
+      intent.putExtra(CallActivity.EXTRA_AUDIO_BITRATE, audioStartBitrate);
+      intent.putExtra(CallActivity.EXTRA_AUDIOCODEC, audioCodec);
+      intent.putExtra(CallActivity.EXTRA_CPUOVERUSE_DETECTION,
+          cpuOveruseDetection);
+      intent.putExtra(CallActivity.EXTRA_DISPLAY_HUD, displayHud);
+      intent.putExtra(CallActivity.EXTRA_CMDLINE, commandLineRun);
+      intent.putExtra(CallActivity.EXTRA_RUNTIME, runTimeMs);
+
+      startActivityForResult(intent, CONNECTION_REQUEST);
+    }
+  }
+
+  private boolean validateUrl(String url) {
+    if (URLUtil.isHttpsUrl(url) || URLUtil.isHttpUrl(url)) {
+      return true;
+    }
+
+    new AlertDialog.Builder(this)
+        .setTitle(getText(R.string.invalid_url_title))
+        .setMessage(getString(R.string.invalid_url_text, url))
+        .setCancelable(false)
+        .setNeutralButton(R.string.ok, new DialogInterface.OnClickListener() {
+            public void onClick(DialogInterface dialog, int id) {
+              dialog.cancel();
+            }
+          }).create().show();
+    return false;
+  }
+
+  private final OnClickListener addRoomListener = new OnClickListener() {
+    @Override
+    public void onClick(View view) {
+      String newRoom = roomEditText.getText().toString();
+      if (newRoom.length() > 0 && !roomList.contains(newRoom)) {
+        adapter.add(newRoom);
+        adapter.notifyDataSetChanged();
+      }
+    }
+  };
+
+  private final OnClickListener removeRoomListener = new OnClickListener() {
+    @Override
+    public void onClick(View view) {
+      String selectedRoom = getSelectedItem();
+      if (selectedRoom != null) {
+        adapter.remove(selectedRoom);
+        adapter.notifyDataSetChanged();
+      }
+    }
+  };
+
+  private String getSelectedItem() {
+    int position = AdapterView.INVALID_POSITION;
+    if (roomListView.getCheckedItemCount() > 0 && adapter.getCount() > 0) {
+      position = roomListView.getCheckedItemPosition();
+      if (position >= adapter.getCount()) {
+        position = AdapterView.INVALID_POSITION;
+      }
+    }
+    if (position != AdapterView.INVALID_POSITION) {
+      return adapter.getItem(position);
+    } else {
+      return null;
+    }
+  }
+
+}
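
For orientation, a hypothetical sketch of triggering the implicit-VIEW/command-line path handled in onCreate() above. It assumes ConnectActivity is registered for VIEW intents on the room-server URL; the wrapper class name and the runtime value are illustrative.

package org.appspot.apprtc;

import android.content.Context;
import android.content.Intent;
import android.net.Uri;

/** Hypothetical snippet: start a loopback call that hangs up automatically. */
final class AutoConnectSketch {
  static void launch(Context context, String roomServerUrl) {
    Intent intent = new Intent(Intent.ACTION_VIEW, Uri.parse(roomServerUrl));
    // In this path the room id comes from the value last saved in shared
    // preferences; the call disconnects itself after EXTRA_RUNTIME ms.
    intent.putExtra(CallActivity.EXTRA_LOOPBACK, true);
    intent.putExtra(CallActivity.EXTRA_RUNTIME, 10000);
    intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
    context.startActivity(intent);
  }
}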
diff --git a/examples/androidapp/src/org/appspot/apprtc/CpuMonitor.java b/examples/androidapp/src/org/appspot/apprtc/CpuMonitor.java
new file mode 100644
index 0000000..1d54e5e
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/CpuMonitor.java
@@ -0,0 +1,299 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.util.Log;
+
+import java.io.BufferedReader;
+import java.io.FileNotFoundException;
+import java.io.FileReader;
+import java.io.IOException;
+import java.util.InputMismatchException;
+import java.util.Scanner;
+
+/**
+ * Simple CPU monitor.  The caller creates a CpuMonitor object which can then
+ * be used via sampleCpuUtilization() to collect the percentage use of the
+ * cumulative CPU capacity for all CPUs running at their nominal frequency.
+ * Three values are generated: (1) getCpuCurrent() returns the use since the
+ * last sampleCpuUtilization(), (2) getCpuAvg3() returns the average use over
+ * the 3 prior calls, and (3) getCpuAvgAll() returns the use over all
+ * SAMPLE_SAVE_NUMBER calls.
+ *
+ * <p>CPUs in Android are often "offline", and while this of course means 0 Hz
+ * as current frequency, in this state we cannot even get their nominal
+ * frequency.  We therefore tread carefully, and allow any CPU to be missing.
+ * Missing CPUs are assumed to have the same nominal frequency as any close
+ * lower-numbered CPU, but as soon as they are online, we'll get their proper
+ * frequency and remember it.  (Since CPU 0 in practice always seems to be
+ * online, this unidirectional frequency inheritance should be no problem in
+ * practice.)
+ *
+ * <p>Caveats:
+ *   o No provision made for zany "turbo" mode, common in the x86 world.
+ *   o No provision made for ARM big.LITTLE; if CPU n can switch behind our
+ *     back, we might get incorrect estimates.
+ *   o This is not thread-safe.  To call asynchronously, create different
+ *     CpuMonitor objects.
+ *
+ * <p>If we can gather enough info to generate a sensible result,
+ * sampleCpuUtilization returns true.  It is designed to never throw an
+ * exception.
+ *
+ * <p>sampleCpuUtilization should not be called too often in its present form,
+ * since then deltas would be small and the percent values would fluctuate and
+ * be unreadable. If it is desirable to call it more often than, say, once per
+ * second, one would need to increase SAMPLE_SAVE_NUMBER and probably use
+ * Queue<Integer> to avoid copying overhead.
+ *
+ * <p>Known problems:
+ *   1. Nexus 7 devices running Kitkat have a kernel which often outputs an
+ *      incorrect 'idle' field in /proc/stat.  The value is close to twice
+ *      the correct value, and then returns back to the correct reading.  Both
+ *      when jumping up and back down we might get faulty CPU load readings.
+ */
+
+class CpuMonitor {
+  private static final int SAMPLE_SAVE_NUMBER = 10;  // Assumed to be >= 3.
+  private int[] percentVec = new int[SAMPLE_SAVE_NUMBER];
+  private int sum3 = 0;
+  private int sum10 = 0;
+  private static final String TAG = "CpuMonitor";
+  private long[] cpuFreq;
+  private int cpusPresent;
+  private double lastPercentFreq = -1;
+  private int cpuCurrent;
+  private int cpuAvg3;
+  private int cpuAvgAll;
+  private boolean initialized = false;
+  private String[] maxPath;
+  private String[] curPath;
+  ProcStat lastProcStat;
+
+  private class ProcStat {
+    final long runTime;
+    final long idleTime;
+
+    ProcStat(long aRunTime, long aIdleTime) {
+      runTime = aRunTime;
+      idleTime = aIdleTime;
+    }
+  }
+
+  private void init() {
+    try {
+      FileReader fin = new FileReader("/sys/devices/system/cpu/present");
+      try {
+        BufferedReader rdr = new BufferedReader(fin);
+        Scanner scanner = new Scanner(rdr).useDelimiter("[-\n]");
+        scanner.nextInt();  // Skip leading number 0.
+        cpusPresent = 1 + scanner.nextInt();
+        scanner.close();
+      } catch (Exception e) {
+        Log.e(TAG, "Cannot do CPU stats due to /sys/devices/system/cpu/present parsing problem");
+      } finally {
+        fin.close();
+      }
+    } catch (FileNotFoundException e) {
+      Log.e(TAG, "Cannot do CPU stats since /sys/devices/system/cpu/present is missing");
+    } catch (IOException e) {
+      Log.e(TAG, "Error closing file");
+    }
+
+    cpuFreq = new long [cpusPresent];
+    maxPath = new String [cpusPresent];
+    curPath = new String [cpusPresent];
+    for (int i = 0; i < cpusPresent; i++) {
+      cpuFreq[i] = 0;  // Frequency "not yet determined".
+      maxPath[i] = "/sys/devices/system/cpu/cpu" + i + "/cpufreq/cpuinfo_max_freq";
+      curPath[i] = "/sys/devices/system/cpu/cpu" + i + "/cpufreq/scaling_cur_freq";
+    }
+
+    lastProcStat = new ProcStat(0, 0);
+
+    initialized = true;
+  }
+
+  /**
+   * Re-measure CPU use.  Call this method at an interval of around one second.
+   * It returns true on success.  The fields
+   * cpuCurrent, cpuAvg3, and cpuAvgAll are updated on success, and represent:
+   * cpuCurrent: the CPU use since the last sampleCpuUtilization call.
+   * cpuAvg3: the average CPU use over the last 3 calls.
+   * cpuAvgAll: the average CPU use over the last SAMPLE_SAVE_NUMBER calls.
+   */
+  public boolean sampleCpuUtilization() {
+    long lastSeenMaxFreq = 0;
+    long cpufreqCurSum = 0;
+    long cpufreqMaxSum = 0;
+
+    if (!initialized) {
+      init();
+    }
+
+    for (int i = 0; i < cpusPresent; i++) {
+      /*
+       * For each CPU, attempt to first read its max frequency, then its
+       * current frequency.  Once the max frequency for a CPU is found,
+       * save it in cpuFreq[].
+       */
+
+      if (cpuFreq[i] == 0) {
+        // We have never found this CPU's max frequency.  Attempt to read it.
+        long cpufreqMax = readFreqFromFile(maxPath[i]);
+        if (cpufreqMax > 0) {
+          lastSeenMaxFreq = cpufreqMax;
+          cpuFreq[i] = cpufreqMax;
+          maxPath[i] = null;  // Kill path to free its memory.
+        }
+      } else {
+        lastSeenMaxFreq = cpuFreq[i];  // A valid, previously read value.
+      }
+
+      long cpufreqCur = readFreqFromFile(curPath[i]);
+      cpufreqCurSum += cpufreqCur;
+
+      /* Here, lastSeenMaxFreq might come from
+       * 1. cpuFreq[i], or
+       * 2. a previous iteration, or
+       * 3. a newly read value, or
+       * 4. hypothetically from the pre-loop initialization value of 0.
+       */
+      cpufreqMaxSum += lastSeenMaxFreq;
+    }
+
+    if (cpufreqMaxSum == 0) {
+      Log.e(TAG, "Could not read max frequency for any CPU");
+      return false;
+    }
+
+    /*
+     * Since the cycle counts are for the period between the last invocation
+     * and this present one, we average the relative CPU frequencies between
+     * now and the beginning of the measurement period.  This is significantly
+     * incorrect only if the frequencies have peaked or dropped in between the
+     * invocations.
+     */
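+    // Worked example (sketch): if cpufreqCurSum adds up to 1.2 GHz against a
+    // cpufreqMaxSum of 2.4 GHz, newPercentFreq is 50.0; with a previous
+    // lastPercentFreq of 70.0, the percentFreq used below would be 60.0.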
+    double newPercentFreq = 100.0 * cpufreqCurSum / cpufreqMaxSum;
+    double percentFreq;
+    if (lastPercentFreq > 0) {
+      percentFreq = (lastPercentFreq + newPercentFreq) * 0.5;
+    } else {
+      percentFreq = newPercentFreq;
+    }
+    lastPercentFreq = newPercentFreq;
+
+    ProcStat procStat = readIdleAndRunTime();
+    if (procStat == null) {
+      return false;
+    }
+
+    long diffRunTime = procStat.runTime - lastProcStat.runTime;
+    long diffIdleTime = procStat.idleTime - lastProcStat.idleTime;
+
+    // Save new measurements for next round's deltas.
+    lastProcStat = procStat;
+
+    long allTime = diffRunTime + diffIdleTime;
+    int percent = allTime == 0 ? 0 : (int) Math.round(percentFreq * diffRunTime / allTime);
+    percent = Math.max(0, Math.min(percent, 100));
+
+    // Subtract the measurement that drops out of the 3-sample window, add the newest.
+    sum3 += percent - percentVec[2];
+    // Subtract the oldest measurement, add the newest.
+    sum10 += percent - percentVec[SAMPLE_SAVE_NUMBER - 1];
+
+    // Rotate saved percent values, save new measurement in vacated spot.
+    for (int i = SAMPLE_SAVE_NUMBER - 1; i > 0; i--) {
+      percentVec[i] = percentVec[i - 1];
+    }
+    percentVec[0] = percent;
+
+    cpuCurrent = percent;
+    cpuAvg3 = sum3 / 3;
+    cpuAvgAll = sum10 / SAMPLE_SAVE_NUMBER;
+
+    return true;
+  }
+
+  public int getCpuCurrent() {
+    return cpuCurrent;
+  }
+
+  public int getCpuAvg3() {
+    return cpuAvg3;
+  }
+
+  public int getCpuAvgAll() {
+    return cpuAvgAll;
+  }
+
+  /**
+   * Read a single integer value from the named file.  Return the read value,
+   * or 0 if an error occurs.
+   */
+  private long readFreqFromFile(String fileName) {
+    long number = 0;
+    try {
+      FileReader fin = new FileReader(fileName);
+      try {
+        BufferedReader rdr = new BufferedReader(fin);
+        Scanner scannerC = new Scanner(rdr);
+        number = scannerC.nextLong();
+        scannerC.close();
+      } catch (Exception e) {
+        // CPU presumably got offline just after we opened file.
+      } finally {
+        fin.close();
+      }
+    } catch (FileNotFoundException e) {
+      // CPU is offline, not an error.
+    } catch (IOException e) {
+      Log.e(TAG, "Error closing file");
+    }
+    return number;
+  }
+
+  /*
+   * Read the current utilization of all CPUs using the cumulative first line
+   * of /proc/stat.
+   */
+  private ProcStat readIdleAndRunTime() {
+    long runTime = 0;
+    long idleTime = 0;
+    try {
+      FileReader fin = new FileReader("/proc/stat");
+      try {
+        BufferedReader rdr = new BufferedReader(fin);
+        Scanner scanner = new Scanner(rdr);
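+        // The first line of /proc/stat is cumulative over all CPUs and looks
+        // roughly like "cpu  79242 0 74306 842486413 ..." (values are a
+        // sketch); the first four numeric fields are user, nice, system and
+        // idle jiffies, which are the ones read below.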
+        scanner.next();
+        long user = scanner.nextLong();
+        long nice = scanner.nextLong();
+        long sys = scanner.nextLong();
+        runTime = user + nice + sys;
+        idleTime = scanner.nextLong();
+        scanner.close();
+      } catch (Exception e) {
+        Log.e(TAG, "Problems parsing /proc/stat");
+        return null;
+      } finally {
+        fin.close();
+      }
+    } catch (FileNotFoundException e) {
+      Log.e(TAG, "Cannot open /proc/stat for reading");
+      return null;
+    } catch (IOException e) {
+      Log.e(TAG, "Problems reading /proc/stat");
+      return null;
+    }
+    return new ProcStat(runTime, idleTime);
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/HudFragment.java b/examples/androidapp/src/org/appspot/apprtc/HudFragment.java
new file mode 100644
index 0000000..cc7015b
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/HudFragment.java
@@ -0,0 +1,200 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.app.Fragment;
+import android.os.Bundle;
+import android.util.TypedValue;
+import android.view.LayoutInflater;
+import android.view.View;
+import android.view.ViewGroup;
+import android.widget.ImageButton;
+import android.widget.TextView;
+
+import org.webrtc.StatsReport;
+
+import java.util.HashMap;
+import java.util.Map;
+
+/**
+ * Fragment for HUD statistics display.
+ */
+public class HudFragment extends Fragment {
+  private View controlView;
+  private TextView encoderStatView;
+  private TextView hudViewBwe;
+  private TextView hudViewConnection;
+  private TextView hudViewVideoSend;
+  private TextView hudViewVideoRecv;
+  private ImageButton toggleDebugButton;
+  private boolean videoCallEnabled;
+  private boolean displayHud;
+  private volatile boolean isRunning;
+  private final CpuMonitor cpuMonitor = new CpuMonitor();
+
+  @Override
+  public View onCreateView(LayoutInflater inflater, ViewGroup container,
+      Bundle savedInstanceState) {
+    controlView = inflater.inflate(R.layout.fragment_hud, container, false);
+
+    // Create UI controls.
+    encoderStatView = (TextView) controlView.findViewById(R.id.encoder_stat_call);
+    hudViewBwe = (TextView) controlView.findViewById(R.id.hud_stat_bwe);
+    hudViewConnection = (TextView) controlView.findViewById(R.id.hud_stat_connection);
+    hudViewVideoSend = (TextView) controlView.findViewById(R.id.hud_stat_video_send);
+    hudViewVideoRecv = (TextView) controlView.findViewById(R.id.hud_stat_video_recv);
+    toggleDebugButton = (ImageButton) controlView.findViewById(R.id.button_toggle_debug);
+
+    toggleDebugButton.setOnClickListener(new View.OnClickListener() {
+      @Override
+      public void onClick(View view) {
+        if (displayHud) {
+          int visibility = (hudViewBwe.getVisibility() == View.VISIBLE)
+              ? View.INVISIBLE : View.VISIBLE;
+          hudViewsSetProperties(visibility);
+        }
+      }
+    });
+
+    return controlView;
+  }
+
+  @Override
+  public void onStart() {
+    super.onStart();
+
+    Bundle args = getArguments();
+    if (args != null) {
+      videoCallEnabled = args.getBoolean(CallActivity.EXTRA_VIDEO_CALL, true);
+      displayHud = args.getBoolean(CallActivity.EXTRA_DISPLAY_HUD, false);
+    }
+    int visibility = displayHud ? View.VISIBLE : View.INVISIBLE;
+    encoderStatView.setVisibility(visibility);
+    toggleDebugButton.setVisibility(visibility);
+    hudViewsSetProperties(View.INVISIBLE);
+    isRunning = true;
+  }
+
+  @Override
+  public void onStop() {
+    isRunning = false;
+    super.onStop();
+  }
+
+  private void hudViewsSetProperties(int visibility) {
+    hudViewBwe.setVisibility(visibility);
+    hudViewConnection.setVisibility(visibility);
+    hudViewVideoSend.setVisibility(visibility);
+    hudViewVideoRecv.setVisibility(visibility);
+    hudViewBwe.setTextSize(TypedValue.COMPLEX_UNIT_PT, 5);
+    hudViewConnection.setTextSize(TypedValue.COMPLEX_UNIT_PT, 5);
+    hudViewVideoSend.setTextSize(TypedValue.COMPLEX_UNIT_PT, 5);
+    hudViewVideoRecv.setTextSize(TypedValue.COMPLEX_UNIT_PT, 5);
+  }
+
+  private Map<String, String> getReportMap(StatsReport report) {
+    Map<String, String> reportMap = new HashMap<String, String>();
+    for (StatsReport.Value value : report.values) {
+      reportMap.put(value.name, value.value);
+    }
+    return reportMap;
+  }
+
+  public void updateEncoderStatistics(final StatsReport[] reports) {
+    if (!isRunning || !displayHud) {
+      return;
+    }
+    StringBuilder encoderStat = new StringBuilder(128);
+    StringBuilder bweStat = new StringBuilder();
+    StringBuilder connectionStat = new StringBuilder();
+    StringBuilder videoSendStat = new StringBuilder();
+    StringBuilder videoRecvStat = new StringBuilder();
+    String fps = null;
+    String targetBitrate = null;
+    String actualBitrate = null;
+
+    for (StatsReport report : reports) {
+      if (report.type.equals("ssrc") && report.id.contains("ssrc")
+          && report.id.contains("send")) {
+        // Send video statistics.
+        Map<String, String> reportMap = getReportMap(report);
+        String trackId = reportMap.get("googTrackId");
+        if (trackId != null && trackId.contains(PeerConnectionClient.VIDEO_TRACK_ID)) {
+          fps = reportMap.get("googFrameRateSent");
+          videoSendStat.append(report.id).append("\n");
+          for (StatsReport.Value value : report.values) {
+            String name = value.name.replace("goog", "");
+            videoSendStat.append(name).append("=").append(value.value).append("\n");
+          }
+        }
+      } else if (report.type.equals("ssrc") && report.id.contains("ssrc")
+          && report.id.contains("recv")) {
+        // Receive video statistics.
+        Map<String, String> reportMap = getReportMap(report);
+        // Check if this stat is for a video track.
+        String frameWidth = reportMap.get("googFrameWidthReceived");
+        if (frameWidth != null) {
+          videoRecvStat.append(report.id).append("\n");
+          for (StatsReport.Value value : report.values) {
+            String name = value.name.replace("goog", "");
+            videoRecvStat.append(name).append("=").append(value.value).append("\n");
+          }
+        }
+      } else if (report.id.equals("bweforvideo")) {
+        // BWE statistics.
+        Map<String, String> reportMap = getReportMap(report);
+        targetBitrate = reportMap.get("googTargetEncBitrate");
+        actualBitrate = reportMap.get("googActualEncBitrate");
+
+        bweStat.append(report.id).append("\n");
+        for (StatsReport.Value value : report.values) {
+          String name = value.name.replace("goog", "").replace("Available", "");
+          bweStat.append(name).append("=").append(value.value).append("\n");
+        }
+      } else if (report.type.equals("googCandidatePair")) {
+        // Connection statistics.
+        Map<String, String> reportMap = getReportMap(report);
+        String activeConnection = reportMap.get("googActiveConnection");
+        if (activeConnection != null && activeConnection.equals("true")) {
+          connectionStat.append(report.id).append("\n");
+          for (StatsReport.Value value : report.values) {
+            String name = value.name.replace("goog", "");
+            connectionStat.append(name).append("=").append(value.value).append("\n");
+          }
+        }
+      }
+    }
+    hudViewBwe.setText(bweStat.toString());
+    hudViewConnection.setText(connectionStat.toString());
+    hudViewVideoSend.setText(videoSendStat.toString());
+    hudViewVideoRecv.setText(videoRecvStat.toString());
+
+    if (videoCallEnabled) {
+      if (fps != null) {
+        encoderStat.append("Fps:  ").append(fps).append("\n");
+      }
+      if (targetBitrate != null) {
+        encoderStat.append("Target BR: ").append(targetBitrate).append("\n");
+      }
+      if (actualBitrate != null) {
+        encoderStat.append("Actual BR: ").append(actualBitrate).append("\n");
+      }
+    }
+
+    if (cpuMonitor.sampleCpuUtilization()) {
+      encoderStat.append("CPU%: ")
+          .append(cpuMonitor.getCpuCurrent()).append("/")
+          .append(cpuMonitor.getCpuAvg3()).append("/")
+          .append(cpuMonitor.getCpuAvgAll());
+    }
+    encoderStatView.setText(encoderStat.toString());
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java b/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java
new file mode 100644
index 0000000..51ef32e
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java
@@ -0,0 +1,1016 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.content.Context;
+import android.opengl.EGLContext;
+import android.util.Log;
+
+import org.appspot.apprtc.AppRTCClient.SignalingParameters;
+import org.appspot.apprtc.util.LooperExecutor;
+import org.webrtc.DataChannel;
+import org.webrtc.IceCandidate;
+import org.webrtc.Logging;
+import org.webrtc.MediaCodecVideoEncoder;
+import org.webrtc.MediaConstraints;
+import org.webrtc.MediaConstraints.KeyValuePair;
+import org.webrtc.MediaStream;
+import org.webrtc.PeerConnection;
+import org.webrtc.PeerConnection.IceConnectionState;
+import org.webrtc.PeerConnectionFactory;
+import org.webrtc.SdpObserver;
+import org.webrtc.SessionDescription;
+import org.webrtc.StatsObserver;
+import org.webrtc.StatsReport;
+import org.webrtc.VideoCapturerAndroid;
+import org.webrtc.VideoRenderer;
+import org.webrtc.VideoSource;
+import org.webrtc.VideoTrack;
+
+import java.util.EnumSet;
+import java.util.LinkedList;
+import java.util.Timer;
+import java.util.TimerTask;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+/**
+ * Peer connection client implementation.
+ *
+ * <p>All public methods are routed to the local looper thread.
+ * All PeerConnectionEvents callbacks are invoked from the same looper thread.
+ * This class is a singleton.
+ */
+public class PeerConnectionClient {
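+  // Typical call order from the UI layer (a sketch; CallActivity drives the
+  // real flow):
+  //   PeerConnectionClient client = PeerConnectionClient.getInstance();
+  //   client.createPeerConnectionFactory(context, eglContext, peerConnectionParameters, events);
+  //   client.createPeerConnection(localRender, remoteRender, signalingParameters);
+  //   client.createOffer();  // Or createAnswer() on the answering side.
+  //   ...
+  //   client.close();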
+  public static final String VIDEO_TRACK_ID = "ARDAMSv0";
+  public static final String AUDIO_TRACK_ID = "ARDAMSa0";
+  private static final String TAG = "PCRTCClient";
+  private static final String FIELD_TRIAL_VP9 = "WebRTC-SupportVP9/Enabled/";
+  private static final String VIDEO_CODEC_VP8 = "VP8";
+  private static final String VIDEO_CODEC_VP9 = "VP9";
+  private static final String VIDEO_CODEC_H264 = "H264";
+  private static final String AUDIO_CODEC_OPUS = "opus";
+  private static final String AUDIO_CODEC_ISAC = "ISAC";
+  private static final String VIDEO_CODEC_PARAM_START_BITRATE =
+      "x-google-start-bitrate";
+  private static final String AUDIO_CODEC_PARAM_BITRATE = "maxaveragebitrate";
+  private static final String AUDIO_ECHO_CANCELLATION_CONSTRAINT = "googEchoCancellation";
+  private static final String AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT = "googAutoGainControl";
+  private static final String AUDIO_HIGH_PASS_FILTER_CONSTRAINT = "googHighpassFilter";
+  private static final String AUDIO_NOISE_SUPPRESSION_CONSTRAINT = "googNoiseSuppression";
+  private static final String MAX_VIDEO_WIDTH_CONSTRAINT = "maxWidth";
+  private static final String MIN_VIDEO_WIDTH_CONSTRAINT = "minWidth";
+  private static final String MAX_VIDEO_HEIGHT_CONSTRAINT = "maxHeight";
+  private static final String MIN_VIDEO_HEIGHT_CONSTRAINT = "minHeight";
+  private static final String MAX_VIDEO_FPS_CONSTRAINT = "maxFrameRate";
+  private static final String MIN_VIDEO_FPS_CONSTRAINT = "minFrameRate";
+  private static final String DTLS_SRTP_KEY_AGREEMENT_CONSTRAINT = "DtlsSrtpKeyAgreement";
+  private static final int HD_VIDEO_WIDTH = 1280;
+  private static final int HD_VIDEO_HEIGHT = 720;
+  private static final int MAX_VIDEO_WIDTH = 1280;
+  private static final int MAX_VIDEO_HEIGHT = 1280;
+  private static final int MAX_VIDEO_FPS = 30;
+
+  private static final PeerConnectionClient instance = new PeerConnectionClient();
+  private final PCObserver pcObserver = new PCObserver();
+  private final SDPObserver sdpObserver = new SDPObserver();
+  private final LooperExecutor executor;
+
+  private PeerConnectionFactory factory;
+  private PeerConnection peerConnection;
+  PeerConnectionFactory.Options options = null;
+  private VideoSource videoSource;
+  private boolean videoCallEnabled;
+  private boolean preferIsac;
+  private boolean preferH264;
+  private boolean videoSourceStopped;
+  private boolean isError;
+  private Timer statsTimer;
+  private VideoRenderer.Callbacks localRender;
+  private VideoRenderer.Callbacks remoteRender;
+  private SignalingParameters signalingParameters;
+  private MediaConstraints pcConstraints;
+  private MediaConstraints videoConstraints;
+  private MediaConstraints audioConstraints;
+  private MediaConstraints sdpMediaConstraints;
+  private PeerConnectionParameters peerConnectionParameters;
+  // Queued remote ICE candidates are consumed only after both local and
+  // remote descriptions are set. Similarly, local ICE candidates are sent to
+  // the remote peer after both local and remote descriptions are set.
+  private LinkedList<IceCandidate> queuedRemoteCandidates;
+  private PeerConnectionEvents events;
+  private boolean isInitiator;
+  private SessionDescription localSdp; // either offer or answer SDP
+  private MediaStream mediaStream;
+  private int numberOfCameras;
+  private VideoCapturerAndroid videoCapturer;
+  // renderVideo is set to true if video should be rendered and sent.
+  private boolean renderVideo;
+  private VideoTrack localVideoTrack;
+  private VideoTrack remoteVideoTrack;
+
+  /**
+   * Peer connection parameters.
+   */
+  public static class PeerConnectionParameters {
+    public final boolean videoCallEnabled;
+    public final boolean loopback;
+    public final int videoWidth;
+    public final int videoHeight;
+    public final int videoFps;
+    public final int videoStartBitrate;
+    public final String videoCodec;
+    public final boolean videoCodecHwAcceleration;
+    public final int audioStartBitrate;
+    public final String audioCodec;
+    public final boolean noAudioProcessing;
+    public final boolean cpuOveruseDetection;
+
+    public PeerConnectionParameters(
+        boolean videoCallEnabled, boolean loopback,
+        int videoWidth, int videoHeight, int videoFps, int videoStartBitrate,
+        String videoCodec, boolean videoCodecHwAcceleration,
+        int audioStartBitrate, String audioCodec,
+        boolean noAudioProcessing, boolean cpuOveruseDetection) {
+      this.videoCallEnabled = videoCallEnabled;
+      this.loopback = loopback;
+      this.videoWidth = videoWidth;
+      this.videoHeight = videoHeight;
+      this.videoFps = videoFps;
+      this.videoStartBitrate = videoStartBitrate;
+      this.videoCodec = videoCodec;
+      this.videoCodecHwAcceleration = videoCodecHwAcceleration;
+      this.audioStartBitrate = audioStartBitrate;
+      this.audioCodec = audioCodec;
+      this.noAudioProcessing = noAudioProcessing;
+      this.cpuOveruseDetection = cpuOveruseDetection;
+    }
+  }
+
+  /**
+   * Peer connection events.
+   */
+  public static interface PeerConnectionEvents {
+    /**
+     * Callback fired once local SDP is created and set.
+     */
+    public void onLocalDescription(final SessionDescription sdp);
+
+    /**
+     * Callback fired once local Ice candidate is generated.
+     */
+    public void onIceCandidate(final IceCandidate candidate);
+
+    /**
+     * Callback fired once connection is established (IceConnectionState is
+     * CONNECTED).
+     */
+    public void onIceConnected();
+
+    /**
+     * Callback fired once connection is closed (IceConnectionState is
+     * DISCONNECTED).
+     */
+    public void onIceDisconnected();
+
+    /**
+     * Callback fired once peer connection is closed.
+     */
+    public void onPeerConnectionClosed();
+
+    /**
+     * Callback fired once peer connection statistics are ready.
+     */
+    public void onPeerConnectionStatsReady(final StatsReport[] reports);
+
+    /**
+     * Callback fired once a peer connection error has occurred.
+     */
+    public void onPeerConnectionError(final String description);
+  }
+
+  private PeerConnectionClient() {
+    executor = new LooperExecutor();
+    // The looper thread is started once in the private ctor and is used for
+    // all peer connection API calls to ensure that a new peer connection
+    // factory is created on the same thread as the previously destroyed factory.
+    executor.requestStart();
+  }
+
+  public static PeerConnectionClient getInstance() {
+    return instance;
+  }
+
+  public void setPeerConnectionFactoryOptions(PeerConnectionFactory.Options options) {
+    this.options = options;
+  }
+
+  public void createPeerConnectionFactory(
+      final Context context,
+      final EGLContext renderEGLContext,
+      final PeerConnectionParameters peerConnectionParameters,
+      final PeerConnectionEvents events) {
+    this.peerConnectionParameters = peerConnectionParameters;
+    this.events = events;
+    videoCallEnabled = peerConnectionParameters.videoCallEnabled;
+    // Reset variables to initial states.
+    factory = null;
+    peerConnection = null;
+    preferIsac = false;
+    preferH264 = false;
+    videoSourceStopped = false;
+    isError = false;
+    queuedRemoteCandidates = null;
+    localSdp = null; // either offer or answer SDP
+    mediaStream = null;
+    videoCapturer = null;
+    renderVideo = true;
+    localVideoTrack = null;
+    remoteVideoTrack = null;
+    statsTimer = new Timer();
+
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        createPeerConnectionFactoryInternal(context, renderEGLContext);
+      }
+    });
+  }
+
+  public void createPeerConnection(
+      final VideoRenderer.Callbacks localRender,
+      final VideoRenderer.Callbacks remoteRender,
+      final SignalingParameters signalingParameters) {
+    if (peerConnectionParameters == null) {
+      Log.e(TAG, "Creating peer connection without initializing factory.");
+      return;
+    }
+    this.localRender = localRender;
+    this.remoteRender = remoteRender;
+    this.signalingParameters = signalingParameters;
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        createMediaConstraintsInternal();
+        createPeerConnectionInternal();
+      }
+    });
+  }
+
+  public void close() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        closeInternal();
+      }
+    });
+  }
+
+  public boolean isVideoCallEnabled() {
+    return videoCallEnabled;
+  }
+
+  private void createPeerConnectionFactoryInternal(
+      Context context, EGLContext renderEGLContext) {
+    Log.d(TAG, "Create peer connection factory with EGLContext "
+        + renderEGLContext + ". Use video: "
+        + peerConnectionParameters.videoCallEnabled);
+    isError = false;
+    // Check if VP9 is used by default.
+    if (videoCallEnabled && peerConnectionParameters.videoCodec != null
+        && peerConnectionParameters.videoCodec.equals(VIDEO_CODEC_VP9)) {
+      PeerConnectionFactory.initializeFieldTrials(FIELD_TRIAL_VP9);
+    } else {
+      PeerConnectionFactory.initializeFieldTrials(null);
+    }
+    // Check if H.264 is used by default.
+    preferH264 = false;
+    if (videoCallEnabled && peerConnectionParameters.videoCodec != null
+        && peerConnectionParameters.videoCodec.equals(VIDEO_CODEC_H264)) {
+      preferH264 = true;
+    }
+    // Check if ISAC is used by default.
+    preferIsac = false;
+    if (peerConnectionParameters.audioCodec != null
+        && peerConnectionParameters.audioCodec.equals(AUDIO_CODEC_ISAC)) {
+      preferIsac = true;
+    }
+    if (!PeerConnectionFactory.initializeAndroidGlobals(
+        context, true, true,
+        peerConnectionParameters.videoCodecHwAcceleration, renderEGLContext)) {
+      events.onPeerConnectionError("Failed to initializeAndroidGlobals");
+    }
+    factory = new PeerConnectionFactory();
+    if (options != null) {
+      Log.d(TAG, "Factory networkIgnoreMask option: " + options.networkIgnoreMask);
+      factory.setOptions(options);
+    }
+    Log.d(TAG, "Peer connection factory created.");
+  }
+
+  private void createMediaConstraintsInternal() {
+    // Create peer connection constraints.
+    pcConstraints = new MediaConstraints();
+    // Enable DTLS for normal calls and disable for loopback calls.
+    if (peerConnectionParameters.loopback) {
+      pcConstraints.optional.add(
+          new MediaConstraints.KeyValuePair(DTLS_SRTP_KEY_AGREEMENT_CONSTRAINT, "false"));
+    } else {
+      pcConstraints.optional.add(
+          new MediaConstraints.KeyValuePair(DTLS_SRTP_KEY_AGREEMENT_CONSTRAINT, "true"));
+    }
+
+    // Check if there is a camera on device and disable video call if not.
+    numberOfCameras = VideoCapturerAndroid.getDeviceCount();
+    if (numberOfCameras == 0) {
+      Log.w(TAG, "No camera on device. Switch to audio only call.");
+      videoCallEnabled = false;
+    }
+    // Create video constraints if video call is enabled.
+    if (videoCallEnabled) {
+      videoConstraints = new MediaConstraints();
+      int videoWidth = peerConnectionParameters.videoWidth;
+      int videoHeight = peerConnectionParameters.videoHeight;
+
+      // If the VP8 HW video encoder is supported and the video resolution is
+      // not specified, force it to HD.
+      if ((videoWidth == 0 || videoHeight == 0)
+          && peerConnectionParameters.videoCodecHwAcceleration
+          && MediaCodecVideoEncoder.isVp8HwSupported()) {
+        videoWidth = HD_VIDEO_WIDTH;
+        videoHeight = HD_VIDEO_HEIGHT;
+      }
+
+      // Add video resolution constraints.
+      if (videoWidth > 0 && videoHeight > 0) {
+        videoWidth = Math.min(videoWidth, MAX_VIDEO_WIDTH);
+        videoHeight = Math.min(videoHeight, MAX_VIDEO_HEIGHT);
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MIN_VIDEO_WIDTH_CONSTRAINT, Integer.toString(videoWidth)));
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MAX_VIDEO_WIDTH_CONSTRAINT, Integer.toString(videoWidth)));
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MIN_VIDEO_HEIGHT_CONSTRAINT, Integer.toString(videoHeight)));
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MAX_VIDEO_HEIGHT_CONSTRAINT, Integer.toString(videoHeight)));
+      }
+
+      // Add fps constraints.
+      int videoFps = peerConnectionParameters.videoFps;
+      if (videoFps > 0) {
+        videoFps = Math.min(videoFps, MAX_VIDEO_FPS);
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MIN_VIDEO_FPS_CONSTRAINT, Integer.toString(videoFps)));
+        videoConstraints.mandatory.add(new KeyValuePair(
+            MAX_VIDEO_FPS_CONSTRAINT, Integer.toString(videoFps)));
+      }
+    }
+
+    // Create audio constraints.
+    audioConstraints = new MediaConstraints();
+    // Added for audio performance measurements.
+    if (peerConnectionParameters.noAudioProcessing) {
+      Log.d(TAG, "Disabling audio processing");
+      audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+            AUDIO_ECHO_CANCELLATION_CONSTRAINT, "false"));
+      audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+            AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT, "false"));
+      audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+            AUDIO_HIGH_PASS_FILTER_CONSTRAINT, "false"));
+      audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+            AUDIO_NOISE_SUPPRESSION_CONSTRAINT, "false"));
+    }
+    // Create SDP constraints.
+    sdpMediaConstraints = new MediaConstraints();
+    sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+        "OfferToReceiveAudio", "true"));
+    if (videoCallEnabled || peerConnectionParameters.loopback) {
+      sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+          "OfferToReceiveVideo", "true"));
+    } else {
+      sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
+          "OfferToReceiveVideo", "false"));
+    }
+  }
+
+  private void createPeerConnectionInternal() {
+    if (factory == null || isError) {
+      Log.e(TAG, "Peerconnection factory is not created");
+      return;
+    }
+    Log.d(TAG, "Create peer connection");
+    Log.d(TAG, "PCConstraints: " + pcConstraints.toString());
+    if (videoConstraints != null) {
+      Log.d(TAG, "VideoConstraints: " + videoConstraints.toString());
+    }
+    queuedRemoteCandidates = new LinkedList<IceCandidate>();
+
+    PeerConnection.RTCConfiguration rtcConfig =
+        new PeerConnection.RTCConfiguration(signalingParameters.iceServers);
+    // TCP candidates are only useful when connecting to a server that supports
+    // ICE-TCP.
+    rtcConfig.tcpCandidatePolicy = PeerConnection.TcpCandidatePolicy.DISABLED;
+    rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE;
+    rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE;
+
+    peerConnection = factory.createPeerConnection(
+        rtcConfig, pcConstraints, pcObserver);
+    isInitiator = false;
+
+    // Set default WebRTC tracing and INFO libjingle logging.
+    // NOTE: this _must_ happen while |factory| is alive!
+    Logging.enableTracing(
+        "logcat:",
+        EnumSet.of(Logging.TraceLevel.TRACE_DEFAULT),
+        Logging.Severity.LS_INFO);
+
+    mediaStream = factory.createLocalMediaStream("ARDAMS");
+    if (videoCallEnabled) {
+      String cameraDeviceName = VideoCapturerAndroid.getDeviceName(0);
+      String frontCameraDeviceName =
+          VideoCapturerAndroid.getNameOfFrontFacingDevice();
+      if (numberOfCameras > 1 && frontCameraDeviceName != null) {
+        cameraDeviceName = frontCameraDeviceName;
+      }
+      Log.d(TAG, "Opening camera: " + cameraDeviceName);
+      videoCapturer = VideoCapturerAndroid.create(cameraDeviceName, null);
+      if (videoCapturer == null) {
+        reportError("Failed to open camera");
+        return;
+      }
+      mediaStream.addTrack(createVideoTrack(videoCapturer));
+    }
+
+    mediaStream.addTrack(factory.createAudioTrack(
+        AUDIO_TRACK_ID,
+        factory.createAudioSource(audioConstraints)));
+    peerConnection.addStream(mediaStream);
+
+    Log.d(TAG, "Peer connection created.");
+  }
+
+  private void closeInternal() {
+    Log.d(TAG, "Closing peer connection.");
+    statsTimer.cancel();
+    if (peerConnection != null) {
+      peerConnection.dispose();
+      peerConnection = null;
+    }
+    Log.d(TAG, "Closing video source.");
+    if (videoSource != null) {
+      videoSource.dispose();
+      videoSource = null;
+    }
+    Log.d(TAG, "Closing peer connection factory.");
+    if (factory != null) {
+      factory.dispose();
+      factory = null;
+    }
+    options = null;
+    Log.d(TAG, "Closing peer connection done.");
+    events.onPeerConnectionClosed();
+  }
+
+  public boolean isHDVideo() {
+    if (!videoCallEnabled) {
+      return false;
+    }
+    int minWidth = 0;
+    int minHeight = 0;
+    for (KeyValuePair keyValuePair : videoConstraints.mandatory) {
+      if (keyValuePair.getKey().equals("minWidth")) {
+        try {
+          minWidth = Integer.parseInt(keyValuePair.getValue());
+        } catch (NumberFormatException e) {
+          Log.e(TAG, "Can not parse video width from video constraints");
+        }
+      } else if (keyValuePair.getKey().equals("minHeight")) {
+        try {
+          minHeight = Integer.parseInt(keyValuePair.getValue());
+        } catch (NumberFormatException e) {
+          Log.e(TAG, "Can not parse video height from video constraints");
+        }
+      }
+    }
+    if (minWidth * minHeight >= 1280 * 720) {
+      return true;
+    } else {
+      return false;
+    }
+  }
+
+  private void getStats() {
+    if (peerConnection == null || isError) {
+      return;
+    }
+    boolean success = peerConnection.getStats(new StatsObserver() {
+      @Override
+      public void onComplete(final StatsReport[] reports) {
+        events.onPeerConnectionStatsReady(reports);
+      }
+    }, null);
+    if (!success) {
+      Log.e(TAG, "getStats() returns false!");
+    }
+  }
+
+  public void enableStatsEvents(boolean enable, int periodMs) {
+    if (enable) {
+      try {
+        statsTimer.schedule(new TimerTask() {
+          @Override
+          public void run() {
+            executor.execute(new Runnable() {
+              @Override
+              public void run() {
+                getStats();
+              }
+            });
+          }
+        }, 0, periodMs);
+      } catch (Exception e) {
+        Log.e(TAG, "Can not schedule statistics timer", e);
+      }
+    } else {
+      statsTimer.cancel();
+    }
+  }
+
+  public void setVideoEnabled(final boolean enable) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        renderVideo = enable;
+        if (localVideoTrack != null) {
+          localVideoTrack.setEnabled(renderVideo);
+        }
+        if (remoteVideoTrack != null) {
+          remoteVideoTrack.setEnabled(renderVideo);
+        }
+      }
+    });
+  }
+
+  public void createOffer() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnection != null && !isError) {
+          Log.d(TAG, "PC Create OFFER");
+          isInitiator = true;
+          peerConnection.createOffer(sdpObserver, sdpMediaConstraints);
+        }
+      }
+    });
+  }
+
+  public void createAnswer() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnection != null && !isError) {
+          Log.d(TAG, "PC create ANSWER");
+          isInitiator = false;
+          peerConnection.createAnswer(sdpObserver, sdpMediaConstraints);
+        }
+      }
+    });
+  }
+
+  public void addRemoteIceCandidate(final IceCandidate candidate) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnection != null && !isError) {
+          if (queuedRemoteCandidates != null) {
+            queuedRemoteCandidates.add(candidate);
+          } else {
+            peerConnection.addIceCandidate(candidate);
+          }
+        }
+      }
+    });
+  }
+
+  public void setRemoteDescription(final SessionDescription sdp) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (peerConnection == null || isError) {
+          return;
+        }
+        String sdpDescription = sdp.description;
+        if (preferIsac) {
+          sdpDescription = preferCodec(sdpDescription, AUDIO_CODEC_ISAC, true);
+        }
+        if (videoCallEnabled && preferH264) {
+          sdpDescription = preferCodec(sdpDescription, VIDEO_CODEC_H264, false);
+        }
+        if (videoCallEnabled && peerConnectionParameters.videoStartBitrate > 0) {
+          sdpDescription = setStartBitrate(VIDEO_CODEC_VP8, true,
+              sdpDescription, peerConnectionParameters.videoStartBitrate);
+          sdpDescription = setStartBitrate(VIDEO_CODEC_VP9, true,
+              sdpDescription, peerConnectionParameters.videoStartBitrate);
+          sdpDescription = setStartBitrate(VIDEO_CODEC_H264, true,
+              sdpDescription, peerConnectionParameters.videoStartBitrate);
+        }
+        if (peerConnectionParameters.audioStartBitrate > 0) {
+          sdpDescription = setStartBitrate(AUDIO_CODEC_OPUS, false,
+              sdpDescription, peerConnectionParameters.audioStartBitrate);
+        }
+        Log.d(TAG, "Set remote SDP.");
+        SessionDescription sdpRemote = new SessionDescription(
+            sdp.type, sdpDescription);
+        peerConnection.setRemoteDescription(sdpObserver, sdpRemote);
+      }
+    });
+  }
+
+  public void stopVideoSource() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (videoSource != null && !videoSourceStopped) {
+          Log.d(TAG, "Stop video source.");
+          videoSource.stop();
+          videoSourceStopped = true;
+        }
+      }
+    });
+  }
+
+  public void startVideoSource() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (videoSource != null && videoSourceStopped) {
+          Log.d(TAG, "Restart video source.");
+          videoSource.restart();
+          videoSourceStopped = false;
+        }
+      }
+    });
+  }
+
+  private void reportError(final String errorMessage) {
+    Log.e(TAG, "Peerconnection error: " + errorMessage);
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (!isError) {
+          events.onPeerConnectionError(errorMessage);
+          isError = true;
+        }
+      }
+    });
+  }
+
+  private VideoTrack createVideoTrack(VideoCapturerAndroid capturer) {
+    videoSource = factory.createVideoSource(capturer, videoConstraints);
+
+    localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
+    localVideoTrack.setEnabled(renderVideo);
+    localVideoTrack.addRenderer(new VideoRenderer(localRender));
+    return localVideoTrack;
+  }
+
+  private static String setStartBitrate(String codec, boolean isVideoCodec,
+      String sdpDescription, int bitrateKbps) {
+    String[] lines = sdpDescription.split("\r\n");
+    int rtpmapLineIndex = -1;
+    boolean sdpFormatUpdated = false;
+    String codecRtpMap = null;
+    // Search for codec rtpmap in format
+    // a=rtpmap:<payload type> <encoding name>/<clock rate> [/<encoding parameters>]
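+    // For example, a VP8 line typically looks like "a=rtpmap:100 VP8/90000"
+    // (the payload type number 100 is illustrative only).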
+    String regex = "^a=rtpmap:(\\d+) " + codec + "(/\\d+)+[\r]?$";
+    Pattern codecPattern = Pattern.compile(regex);
+    for (int i = 0; i < lines.length; i++) {
+      Matcher codecMatcher = codecPattern.matcher(lines[i]);
+      if (codecMatcher.matches()) {
+        codecRtpMap = codecMatcher.group(1);
+        rtpmapLineIndex = i;
+        break;
+      }
+    }
+    if (codecRtpMap == null) {
+      Log.w(TAG, "No rtpmap for " + codec + " codec");
+      return sdpDescription;
+    }
+    Log.d(TAG, "Found " + codec + " rtpmap " + codecRtpMap
+        + " at " + lines[rtpmapLineIndex]);
+
+    // Check if an a=fmtp string already exists in the remote SDP for this
+    // codec and update it with the new bitrate parameter.
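+    // E.g. an existing "a=fmtp:111 minptime=10; useinbandfec=1" line for opus
+    // would get "; maxaveragebitrate=<bitrateKbps*1000>" appended (payload
+    // type 111 here is just an illustration).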
+    regex = "^a=fmtp:" + codecRtpMap + " \\w+=\\d+.*[\r]?$";
+    codecPattern = Pattern.compile(regex);
+    for (int i = 0; i < lines.length; i++) {
+      Matcher codecMatcher = codecPattern.matcher(lines[i]);
+      if (codecMatcher.matches()) {
+        Log.d(TAG, "Found " + codec + " " + lines[i]);
+        if (isVideoCodec) {
+          lines[i] += "; " + VIDEO_CODEC_PARAM_START_BITRATE
+              + "=" + bitrateKbps;
+        } else {
+          lines[i] += "; " + AUDIO_CODEC_PARAM_BITRATE
+              + "=" + (bitrateKbps * 1000);
+        }
+        Log.d(TAG, "Update remote SDP line: " + lines[i]);
+        sdpFormatUpdated = true;
+        break;
+      }
+    }
+
+    StringBuilder newSdpDescription = new StringBuilder();
+    for (int i = 0; i < lines.length; i++) {
+      newSdpDescription.append(lines[i]).append("\r\n");
+      // Append a new a=fmtp line if no such line exists for the codec.
+      if (!sdpFormatUpdated && i == rtpmapLineIndex) {
+        String bitrateSet;
+        if (isVideoCodec) {
+          bitrateSet = "a=fmtp:" + codecRtpMap + " "
+              + VIDEO_CODEC_PARAM_START_BITRATE + "=" + bitrateKbps;
+        } else {
+          bitrateSet = "a=fmtp:" + codecRtpMap + " "
+              + AUDIO_CODEC_PARAM_BITRATE + "=" + (bitrateKbps * 1000);
+        }
+        Log.d(TAG, "Add remote SDP line: " + bitrateSet);
+        newSdpDescription.append(bitrateSet).append("\r\n");
+      }
+    }
+    return newSdpDescription.toString();
+  }
+
+  private static String preferCodec(
+      String sdpDescription, String codec, boolean isAudio) {
+    String[] lines = sdpDescription.split("\r\n");
+    int mLineIndex = -1;
+    String codecRtpMap = null;
+    // a=rtpmap:<payload type> <encoding name>/<clock rate> [/<encoding parameters>]
+    String regex = "^a=rtpmap:(\\d+) " + codec + "(/\\d+)+[\r]?$";
+    Pattern codecPattern = Pattern.compile(regex);
+    String mediaDescription = "m=video ";
+    if (isAudio) {
+      mediaDescription = "m=audio ";
+    }
+    for (int i = 0; (i < lines.length)
+        && (mLineIndex == -1 || codecRtpMap == null); i++) {
+      if (lines[i].startsWith(mediaDescription)) {
+        mLineIndex = i;
+        continue;
+      }
+      Matcher codecMatcher = codecPattern.matcher(lines[i]);
+      if (codecMatcher.matches()) {
+        codecRtpMap = codecMatcher.group(1);
+        continue;
+      }
+    }
+    if (mLineIndex == -1) {
+      Log.w(TAG, "No " + mediaDescription + " line, so can't prefer " + codec);
+      return sdpDescription;
+    }
+    if (codecRtpMap == null) {
+      Log.w(TAG, "No rtpmap for " + codec);
+      return sdpDescription;
+    }
+    Log.d(TAG, "Found " + codec + " rtpmap " + codecRtpMap + ", prefer at "
+        + lines[mLineIndex]);
+    String[] origMLineParts = lines[mLineIndex].split(" ");
+    if (origMLineParts.length > 3) {
+      StringBuilder newMLine = new StringBuilder();
+      int origPartIndex = 0;
+      // Format is: m=<media> <port> <proto> <fmt> ...
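+      // E.g. "m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104" becomes
+      // "m=audio 9 UDP/TLS/RTP/SAVPF 103 111 104" when the preferred codec has
+      // payload type 103 in this sketch; the numbers are illustrative only.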
+      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
+      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
+      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
+      newMLine.append(codecRtpMap);
+      for (; origPartIndex < origMLineParts.length; origPartIndex++) {
+        if (!origMLineParts[origPartIndex].equals(codecRtpMap)) {
+          newMLine.append(" ").append(origMLineParts[origPartIndex]);
+        }
+      }
+      lines[mLineIndex] = newMLine.toString();
+      Log.d(TAG, "Change media description: " + lines[mLineIndex]);
+    } else {
+      Log.e(TAG, "Wrong SDP media description format: " + lines[mLineIndex]);
+    }
+    StringBuilder newSdpDescription = new StringBuilder();
+    for (String line : lines) {
+      newSdpDescription.append(line).append("\r\n");
+    }
+    return newSdpDescription.toString();
+  }
+
+  private void drainCandidates() {
+    if (queuedRemoteCandidates != null) {
+      Log.d(TAG, "Add " + queuedRemoteCandidates.size() + " remote candidates");
+      for (IceCandidate candidate : queuedRemoteCandidates) {
+        peerConnection.addIceCandidate(candidate);
+      }
+      queuedRemoteCandidates = null;
+    }
+  }
+
+  private void switchCameraInternal() {
+    if (!videoCallEnabled || numberOfCameras < 2 || isError || videoCapturer == null) {
+      Log.e(TAG, "Failed to switch camera. Video: " + videoCallEnabled + ". Error : "
+          + isError + ". Number of cameras: " + numberOfCameras);
+      return;  // No video is sent or only one camera is available or error happened.
+    }
+    Log.d(TAG, "Switch camera");
+    videoCapturer.switchCamera(null);
+  }
+
+  public void switchCamera() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        switchCameraInternal();
+      }
+    });
+  }
+
+  // Implementation detail: observe ICE & stream changes and react accordingly.
+  private class PCObserver implements PeerConnection.Observer {
+    @Override
+    public void onIceCandidate(final IceCandidate candidate){
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          events.onIceCandidate(candidate);
+        }
+      });
+    }
+
+    @Override
+    public void onSignalingChange(
+        PeerConnection.SignalingState newState) {
+      Log.d(TAG, "SignalingState: " + newState);
+    }
+
+    @Override
+    public void onIceConnectionChange(
+        final PeerConnection.IceConnectionState newState) {
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          Log.d(TAG, "IceConnectionState: " + newState);
+          if (newState == IceConnectionState.CONNECTED) {
+            events.onIceConnected();
+          } else if (newState == IceConnectionState.DISCONNECTED) {
+            events.onIceDisconnected();
+          } else if (newState == IceConnectionState.FAILED) {
+            reportError("ICE connection failed.");
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onIceGatheringChange(
+      PeerConnection.IceGatheringState newState) {
+      Log.d(TAG, "IceGatheringState: " + newState);
+    }
+
+    @Override
+    public void onIceConnectionReceivingChange(boolean receiving) {
+      Log.d(TAG, "IceConnectionReceiving changed to " + receiving);
+    }
+
+    @Override
+    public void onAddStream(final MediaStream stream){
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (peerConnection == null || isError) {
+            return;
+          }
+          if (stream.audioTracks.size() > 1 || stream.videoTracks.size() > 1) {
+            reportError("Weird-looking stream: " + stream);
+            return;
+          }
+          if (stream.videoTracks.size() == 1) {
+            remoteVideoTrack = stream.videoTracks.get(0);
+            remoteVideoTrack.setEnabled(renderVideo);
+            remoteVideoTrack.addRenderer(new VideoRenderer(remoteRender));
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onRemoveStream(final MediaStream stream){
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (peerConnection == null || isError) {
+            return;
+          }
+          remoteVideoTrack = null;
+          stream.videoTracks.get(0).dispose();
+        }
+      });
+    }
+
+    @Override
+    public void onDataChannel(final DataChannel dc) {
+      reportError("AppRTC doesn't use data channels, but got: " + dc.label()
+          + " anyway!");
+    }
+
+    @Override
+    public void onRenegotiationNeeded() {
+      // No need to do anything; AppRTC follows a pre-agreed-upon
+      // signaling/negotiation protocol.
+    }
+  }
+
+  // Implementation detail: handle offer creation/signaling and answer setting,
+  // as well as adding remote ICE candidates once the answer SDP is set.
+  private class SDPObserver implements SdpObserver {
+    @Override
+    public void onCreateSuccess(final SessionDescription origSdp) {
+      if (localSdp != null) {
+        reportError("Multiple SDP create.");
+        return;
+      }
+      String sdpDescription = origSdp.description;
+      if (preferIsac) {
+        sdpDescription = preferCodec(sdpDescription, AUDIO_CODEC_ISAC, true);
+      }
+      if (videoCallEnabled && preferH264) {
+        sdpDescription = preferCodec(sdpDescription, VIDEO_CODEC_H264, false);
+      }
+      final SessionDescription sdp = new SessionDescription(
+          origSdp.type, sdpDescription);
+      localSdp = sdp;
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (peerConnection != null && !isError) {
+            Log.d(TAG, "Set local SDP from " + sdp.type);
+            peerConnection.setLocalDescription(sdpObserver, sdp);
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onSetSuccess() {
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (peerConnection == null || isError) {
+            return;
+          }
+          if (isInitiator) {
+            // For an offering peer connection we first create the offer and set
+            // the local SDP, then set the remote SDP after receiving the answer.
+            if (peerConnection.getRemoteDescription() == null) {
+              // We've just set our local SDP, so it is time to send it.
+              Log.d(TAG, "Local SDP set successfully");
+              events.onLocalDescription(localSdp);
+            } else {
+              // We've just set the remote description, so drain the remote
+              // and send the local ICE candidates.
+              Log.d(TAG, "Remote SDP set successfully");
+              drainCandidates();
+            }
+          } else {
+            // For an answering peer connection we set the remote SDP and then
+            // create the answer and set the local SDP.
+            if (peerConnection.getLocalDescription() != null) {
+              // We've just set our local SDP, so it is time to send it, drain
+              // the remote and send the local ICE candidates.
+              Log.d(TAG, "Local SDP set successfully");
+              events.onLocalDescription(localSdp);
+              drainCandidates();
+            } else {
+              // We've just set the remote SDP - do nothing for now -
+              // the answer will be created soon.
+              Log.d(TAG, "Remote SDP set successfully");
+            }
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onCreateFailure(final String error) {
+      reportError("createSDP error: " + error);
+    }
+
+    @Override
+    public void onSetFailure(final String error) {
+      reportError("setSDP error: " + error);
+    }
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/RoomParametersFetcher.java b/examples/androidapp/src/org/appspot/apprtc/RoomParametersFetcher.java
new file mode 100644
index 0000000..a751f92
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/RoomParametersFetcher.java
@@ -0,0 +1,222 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.AppRTCClient.SignalingParameters;
+import org.appspot.apprtc.util.AsyncHttpURLConnection;
+import org.appspot.apprtc.util.AsyncHttpURLConnection.AsyncHttpEvents;
+
+import android.util.Log;
+
+import org.json.JSONArray;
+import org.json.JSONException;
+import org.json.JSONObject;
+import org.webrtc.IceCandidate;
+import org.webrtc.PeerConnection;
+import org.webrtc.SessionDescription;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.util.LinkedList;
+import java.util.Scanner;
+
+/**
+ * Asynchronously converts an AppRTC room URL into the set of signaling
+ * parameters to use with that room.
+ */
+public class RoomParametersFetcher {
+  private static final String TAG = "RoomRTCClient";
+  private static final int TURN_HTTP_TIMEOUT_MS = 5000;
+  private final RoomParametersFetcherEvents events;
+  private final String roomUrl;
+  private final String roomMessage;
+  private AsyncHttpURLConnection httpConnection;
+
+  /**
+   * Room parameters fetcher callbacks.
+   */
+  public static interface RoomParametersFetcherEvents {
+    /**
+     * Callback fired once the room's SignalingParameters are extracted.
+     */
+    public void onSignalingParametersReady(final SignalingParameters params);
+
+    /**
+     * Callback for room parameters extraction error.
+     */
+    public void onSignalingParametersError(final String description);
+  }
+
+  public RoomParametersFetcher(String roomUrl, String roomMessage,
+      final RoomParametersFetcherEvents events) {
+    this.roomUrl = roomUrl;
+    this.roomMessage = roomMessage;
+    this.events = events;
+  }
+
+  public void makeRequest() {
+    Log.d(TAG, "Connecting to room: " + roomUrl);
+    httpConnection = new AsyncHttpURLConnection(
+        "POST", roomUrl, roomMessage,
+        new AsyncHttpEvents() {
+          @Override
+          public void onHttpError(String errorMessage) {
+            Log.e(TAG, "Room connection error: " + errorMessage);
+            events.onSignalingParametersError(errorMessage);
+          }
+
+          @Override
+          public void onHttpComplete(String response) {
+            roomHttpResponseParse(response);
+          }
+        });
+    httpConnection.send();
+  }
+
+  private void roomHttpResponseParse(String response) {
+    Log.d(TAG, "Room response: " + response);
+    try {
+      LinkedList<IceCandidate> iceCandidates = null;
+      SessionDescription offerSdp = null;
+      JSONObject roomJson = new JSONObject(response);
+
+      String result = roomJson.getString("result");
+      if (!result.equals("SUCCESS")) {
+        events.onSignalingParametersError("Room response error: " + result);
+        return;
+      }
+      response = roomJson.getString("params");
+      roomJson = new JSONObject(response);
+      String roomId = roomJson.getString("room_id");
+      String clientId = roomJson.getString("client_id");
+      String wssUrl = roomJson.getString("wss_url");
+      String wssPostUrl = roomJson.getString("wss_post_url");
+      boolean initiator = (roomJson.getBoolean("is_initiator"));
+      if (!initiator) {
+        iceCandidates = new LinkedList<IceCandidate>();
+        String messagesString = roomJson.getString("messages");
+        JSONArray messages = new JSONArray(messagesString);
+        for (int i = 0; i < messages.length(); ++i) {
+          String messageString = messages.getString(i);
+          JSONObject message = new JSONObject(messageString);
+          String messageType = message.getString("type");
+          Log.d(TAG, "GAE->C #" + i + " : " + messageString);
+          if (messageType.equals("offer")) {
+            offerSdp = new SessionDescription(
+                SessionDescription.Type.fromCanonicalForm(messageType),
+                message.getString("sdp"));
+          } else if (messageType.equals("candidate")) {
+            IceCandidate candidate = new IceCandidate(
+                message.getString("id"),
+                message.getInt("label"),
+                message.getString("candidate"));
+            iceCandidates.add(candidate);
+          } else {
+            Log.e(TAG, "Unknown message: " + messageString);
+          }
+        }
+      }
+      Log.d(TAG, "RoomId: " + roomId + ". ClientId: " + clientId);
+      Log.d(TAG, "Initiator: " + initiator);
+      Log.d(TAG, "WSS url: " + wssUrl);
+      Log.d(TAG, "WSS POST url: " + wssPostUrl);
+
+      LinkedList<PeerConnection.IceServer> iceServers =
+          iceServersFromPCConfigJSON(roomJson.getString("pc_config"));
+      boolean isTurnPresent = false;
+      for (PeerConnection.IceServer server : iceServers) {
+        Log.d(TAG, "IceServer: " + server);
+        if (server.uri.startsWith("turn:")) {
+          isTurnPresent = true;
+          break;
+        }
+      }
+      // Request TURN servers.
+      if (!isTurnPresent) {
+        LinkedList<PeerConnection.IceServer> turnServers =
+            requestTurnServers(roomJson.getString("turn_url"));
+        for (PeerConnection.IceServer turnServer : turnServers) {
+          Log.d(TAG, "TurnServer: " + turnServer);
+          iceServers.add(turnServer);
+        }
+      }
+
+      SignalingParameters params = new SignalingParameters(
+          iceServers, initiator,
+          clientId, wssUrl, wssPostUrl,
+          offerSdp, iceCandidates);
+      events.onSignalingParametersReady(params);
+    } catch (JSONException e) {
+      events.onSignalingParametersError(
+          "Room JSON parsing error: " + e.toString());
+    } catch (IOException e) {
+      events.onSignalingParametersError("Room IO error: " + e.toString());
+    }
+  }
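+
+  // Illustrative only: roomHttpResponseParse() above expects a payload shaped
+  // roughly like the following (field names are taken from the parsing code,
+  // values are hypothetical; "params" arrives as a JSON-encoded string):
+  //   {"result": "SUCCESS", "params": "{...}"}
+  // The decoded "params" object carries room_id, client_id, wss_url,
+  // wss_post_url, is_initiator, messages, pc_config and turn_url.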
+
+  // Requests and returns TURN ICE servers based on a request URL. Must be run
+  // off the main thread!
+  private LinkedList<PeerConnection.IceServer> requestTurnServers(String url)
+      throws IOException, JSONException {
+    LinkedList<PeerConnection.IceServer> turnServers =
+        new LinkedList<PeerConnection.IceServer>();
+    Log.d(TAG, "Request TURN from: " + url);
+    HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
+    connection.setConnectTimeout(TURN_HTTP_TIMEOUT_MS);
+    connection.setReadTimeout(TURN_HTTP_TIMEOUT_MS);
+    int responseCode = connection.getResponseCode();
+    if (responseCode != 200) {
+      throw new IOException("Non-200 response when requesting TURN server from "
+          + url + " : " + connection.getHeaderField(null));
+    }
+    InputStream responseStream = connection.getInputStream();
+    String response = drainStream(responseStream);
+    connection.disconnect();
+    Log.d(TAG, "TURN response: " + response);
+    JSONObject responseJSON = new JSONObject(response);
+    String username = responseJSON.getString("username");
+    String password = responseJSON.getString("password");
+    JSONArray turnUris = responseJSON.getJSONArray("uris");
+    for (int i = 0; i < turnUris.length(); i++) {
+      String uri = turnUris.getString(i);
+      turnServers.add(new PeerConnection.IceServer(uri, username, password));
+    }
+    return turnServers;
+  }
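+
+  // Illustrative only: the request above expects a TURN response shaped
+  // roughly like the following (values are hypothetical):
+  //   {"username": "user", "password": "secret",
+  //    "uris": ["turn:turn.example.org:3478?transport=udp"]}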
+
+  // Return the list of ICE servers described by a WebRTCPeerConnection
+  // configuration string.
+  private LinkedList<PeerConnection.IceServer> iceServersFromPCConfigJSON(
+      String pcConfig) throws JSONException {
+    JSONObject json = new JSONObject(pcConfig);
+    JSONArray servers = json.getJSONArray("iceServers");
+    LinkedList<PeerConnection.IceServer> ret =
+        new LinkedList<PeerConnection.IceServer>();
+    for (int i = 0; i < servers.length(); ++i) {
+      JSONObject server = servers.getJSONObject(i);
+      String url = server.getString("urls");
+      String credential =
+          server.has("credential") ? server.getString("credential") : "";
+      ret.add(new PeerConnection.IceServer(url, "", credential));
+    }
+    return ret;
+  }
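+
+  // Illustrative only: a pc_config string accepted by the parser above (values
+  // are hypothetical; note that "urls" is read as a single string here):
+  //   {"iceServers": [{"urls": "stun:stun.example.org:19302"},
+  //                   {"urls": "turn:turn.example.org:3478", "credential": "c"}]}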
+
+  // Return the contents of an InputStream as a String.
+  private static String drainStream(InputStream in) {
+    Scanner s = new Scanner(in).useDelimiter("\\A");
+    return s.hasNext() ? s.next() : "";
+  }
+
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/SettingsActivity.java b/examples/androidapp/src/org/appspot/apprtc/SettingsActivity.java
new file mode 100644
index 0000000..ce7d989
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/SettingsActivity.java
@@ -0,0 +1,177 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.app.Activity;
+import android.content.SharedPreferences;
+import android.content.SharedPreferences.OnSharedPreferenceChangeListener;
+import android.os.Bundle;
+import android.preference.Preference;
+
+/**
+ * Settings activity for AppRTC.
+ */
+public class SettingsActivity extends Activity
+    implements OnSharedPreferenceChangeListener {
+  private SettingsFragment settingsFragment;
+  private String keyprefVideoCall;
+  private String keyprefResolution;
+  private String keyprefFps;
+  private String keyprefStartVideoBitrateType;
+  private String keyprefStartVideoBitrateValue;
+  private String keyPrefVideoCodec;
+  private String keyprefHwCodec;
+
+  private String keyprefStartAudioBitrateType;
+  private String keyprefStartAudioBitrateValue;
+  private String keyPrefAudioCodec;
+  private String keyprefNoAudioProcessing;
+
+  private String keyprefCpuUsageDetection;
+  private String keyPrefRoomServerUrl;
+  private String keyPrefDisplayHud;
+
+  @Override
+  protected void onCreate(Bundle savedInstanceState) {
+    super.onCreate(savedInstanceState);
+    keyprefVideoCall = getString(R.string.pref_videocall_key);
+    keyprefResolution = getString(R.string.pref_resolution_key);
+    keyprefFps = getString(R.string.pref_fps_key);
+    keyprefStartVideoBitrateType = getString(R.string.pref_startvideobitrate_key);
+    keyprefStartVideoBitrateValue = getString(R.string.pref_startvideobitratevalue_key);
+    keyPrefVideoCodec = getString(R.string.pref_videocodec_key);
+    keyprefHwCodec = getString(R.string.pref_hwcodec_key);
+
+    keyprefStartAudioBitrateType = getString(R.string.pref_startaudiobitrate_key);
+    keyprefStartAudioBitrateValue = getString(R.string.pref_startaudiobitratevalue_key);
+    keyPrefAudioCodec = getString(R.string.pref_audiocodec_key);
+    keyprefNoAudioProcessing = getString(R.string.pref_noaudioprocessing_key);
+
+    keyprefCpuUsageDetection = getString(R.string.pref_cpu_usage_detection_key);
+    keyPrefRoomServerUrl = getString(R.string.pref_room_server_url_key);
+    keyPrefDisplayHud = getString(R.string.pref_displayhud_key);
+
+    // Display the fragment as the main content.
+    settingsFragment = new SettingsFragment();
+    getFragmentManager().beginTransaction()
+        .replace(android.R.id.content, settingsFragment)
+        .commit();
+  }
+
+  @Override
+  protected void onResume() {
+    super.onResume();
+    // Set summary to be the user-description for the selected value
+    SharedPreferences sharedPreferences =
+        settingsFragment.getPreferenceScreen().getSharedPreferences();
+    sharedPreferences.registerOnSharedPreferenceChangeListener(this);
+    updateSummaryB(sharedPreferences, keyprefVideoCall);
+    updateSummary(sharedPreferences, keyprefResolution);
+    updateSummary(sharedPreferences, keyprefFps);
+    updateSummary(sharedPreferences, keyprefStartVideoBitrateType);
+    updateSummaryBitrate(sharedPreferences, keyprefStartVideoBitrateValue);
+    setVideoBitrateEnable(sharedPreferences);
+    updateSummary(sharedPreferences, keyPrefVideoCodec);
+    updateSummaryB(sharedPreferences, keyprefHwCodec);
+
+    updateSummary(sharedPreferences, keyprefStartAudioBitrateType);
+    updateSummaryBitrate(sharedPreferences, keyprefStartAudioBitrateValue);
+    setAudioBitrateEnable(sharedPreferences);
+    updateSummary(sharedPreferences, keyPrefAudioCodec);
+    updateSummaryB(sharedPreferences, keyprefNoAudioProcessing);
+
+    updateSummaryB(sharedPreferences, keyprefCpuUsageDetection);
+    updateSummary(sharedPreferences, keyPrefRoomServerUrl);
+    updateSummaryB(sharedPreferences, keyPrefDisplayHud);
+  }
+
+  @Override
+  protected void onPause() {
+    super.onPause();
+    SharedPreferences sharedPreferences =
+        settingsFragment.getPreferenceScreen().getSharedPreferences();
+    sharedPreferences.unregisterOnSharedPreferenceChangeListener(this);
+  }
+
+  @Override
+  public void onSharedPreferenceChanged(SharedPreferences sharedPreferences,
+      String key) {
+    if (key.equals(keyprefResolution)
+        || key.equals(keyprefFps)
+        || key.equals(keyprefStartVideoBitrateType)
+        || key.equals(keyPrefVideoCodec)
+        || key.equals(keyprefStartAudioBitrateType)
+        || key.equals(keyPrefAudioCodec)
+        || key.equals(keyPrefRoomServerUrl)) {
+      updateSummary(sharedPreferences, key);
+    } else if (key.equals(keyprefStartVideoBitrateValue)
+        || key.equals(keyprefStartAudioBitrateValue)) {
+      updateSummaryBitrate(sharedPreferences, key);
+    } else if (key.equals(keyprefVideoCall)
+        || key.equals(keyprefHwCodec)
+        || key.equals(keyprefNoAudioProcessing)
+        || key.equals(keyprefCpuUsageDetection)
+        || key.equals(keyPrefDisplayHud)) {
+      updateSummaryB(sharedPreferences, key);
+    }
+    if (key.equals(keyprefStartVideoBitrateType)) {
+      setVideoBitrateEnable(sharedPreferences);
+    }
+    if (key.equals(keyprefStartAudioBitrateType)) {
+      setAudioBitrateEnable(sharedPreferences);
+    }
+  }
+
+  private void updateSummary(SharedPreferences sharedPreferences, String key) {
+    Preference updatedPref = settingsFragment.findPreference(key);
+    // Set summary to be the user-description for the selected value
+    updatedPref.setSummary(sharedPreferences.getString(key, ""));
+  }
+
+  private void updateSummaryBitrate(
+      SharedPreferences sharedPreferences, String key) {
+    Preference updatedPref = settingsFragment.findPreference(key);
+    updatedPref.setSummary(sharedPreferences.getString(key, "") + " kbps");
+  }
+
+  private void updateSummaryB(SharedPreferences sharedPreferences, String key) {
+    Preference updatedPref = settingsFragment.findPreference(key);
+    updatedPref.setSummary(sharedPreferences.getBoolean(key, true)
+        ? getString(R.string.pref_value_enabled)
+        : getString(R.string.pref_value_disabled));
+  }
+
+  private void setVideoBitrateEnable(SharedPreferences sharedPreferences) {
+    Preference bitratePreferenceValue =
+        settingsFragment.findPreference(keyprefStartVideoBitrateValue);
+    String bitrateTypeDefault = getString(R.string.pref_startvideobitrate_default);
+    String bitrateType = sharedPreferences.getString(
+        keyprefStartVideoBitrateType, bitrateTypeDefault);
+    if (bitrateType.equals(bitrateTypeDefault)) {
+      bitratePreferenceValue.setEnabled(false);
+    } else {
+      bitratePreferenceValue.setEnabled(true);
+    }
+  }
+
+  private void setAudioBitrateEnable(SharedPreferences sharedPreferences) {
+    Preference bitratePreferenceValue =
+        settingsFragment.findPreference(keyprefStartAudioBitrateValue);
+    String bitrateTypeDefault = getString(R.string.pref_startaudiobitrate_default);
+    String bitrateType = sharedPreferences.getString(
+        keyprefStartAudioBitrateType, bitrateTypeDefault);
+    if (bitrateType.equals(bitrateTypeDefault)) {
+      bitratePreferenceValue.setEnabled(false);
+    } else {
+      bitratePreferenceValue.setEnabled(true);
+    }
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/SettingsFragment.java b/examples/androidapp/src/org/appspot/apprtc/SettingsFragment.java
new file mode 100644
index 0000000..3fc5b51
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/SettingsFragment.java
@@ -0,0 +1,27 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.os.Bundle;
+import android.preference.PreferenceFragment;
+
+/**
+ * Settings fragment for AppRTC.
+ */
+public class SettingsFragment extends PreferenceFragment {
+
+  @Override
+  public void onCreate(Bundle savedInstanceState) {
+    super.onCreate(savedInstanceState);
+    // Load the preferences from an XML resource
+    addPreferencesFromResource(R.xml.preferences);
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/UnhandledExceptionHandler.java b/examples/androidapp/src/org/appspot/apprtc/UnhandledExceptionHandler.java
new file mode 100644
index 0000000..a9a136b
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/UnhandledExceptionHandler.java
@@ -0,0 +1,86 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import android.app.Activity;
+import android.app.AlertDialog;
+import android.content.DialogInterface;
+import android.util.Log;
+import android.util.TypedValue;
+import android.widget.ScrollView;
+import android.widget.TextView;
+
+import java.io.PrintWriter;
+import java.io.StringWriter;
+
+/**
+ * Singleton helper: install a default unhandled exception handler which shows
+ * an informative dialog and kills the app.  Useful for apps whose
+ * error-handling consists of throwing RuntimeExceptions.
+ * NOTE: it is almost always more useful to call
+ * Thread.setDefaultUncaughtExceptionHandler() rather than
+ * Thread.setUncaughtExceptionHandler(), so that background threads are
+ * covered as well.
+ */
+public class UnhandledExceptionHandler
+    implements Thread.UncaughtExceptionHandler {
+  private static final String TAG = "AppRTCDemoActivity";
+  private final Activity activity;
+
+  public UnhandledExceptionHandler(final Activity activity) {
+    this.activity = activity;
+  }
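+
+  // Typical installation from an Activity (a sketch; see the NOTE in the class
+  // comment on why the default handler is usually preferable):
+  //   Thread.setDefaultUncaughtExceptionHandler(
+  //       new UnhandledExceptionHandler(this));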
+
+  public void uncaughtException(Thread unusedThread, final Throwable e) {
+    activity.runOnUiThread(new Runnable() {
+        @Override public void run() {
+          String title = "Fatal error: " + getTopLevelCauseMessage(e);
+          String msg = getRecursiveStackTrace(e);
+          TextView errorView = new TextView(activity);
+          errorView.setText(msg);
+          errorView.setTextSize(TypedValue.COMPLEX_UNIT_SP, 8);
+          ScrollView scrollingContainer = new ScrollView(activity);
+          scrollingContainer.addView(errorView);
+          Log.e(TAG, title + "\n\n" + msg);
+          DialogInterface.OnClickListener listener =
+              new DialogInterface.OnClickListener() {
+                @Override public void onClick(
+                    DialogInterface dialog, int which) {
+                  dialog.dismiss();
+                  System.exit(1);
+                }
+              };
+          AlertDialog.Builder builder =
+              new AlertDialog.Builder(activity);
+          builder
+              .setTitle(title)
+              .setView(scrollingContainer)
+              .setPositiveButton("Exit", listener).show();
+        }
+      });
+  }
+
+  // Returns the Message attached to the original Cause of |t|.
+  private static String getTopLevelCauseMessage(Throwable t) {
+    Throwable topLevelCause = t;
+    while (topLevelCause.getCause() != null) {
+      topLevelCause = topLevelCause.getCause();
+    }
+    return topLevelCause.getMessage();
+  }
+
+  // Returns a human-readable String of the stacktrace in |t|, recursively
+  // through all Causes that led to |t|.
+  private static String getRecursiveStackTrace(Throwable t) {
+    StringWriter writer = new StringWriter();
+    t.printStackTrace(new PrintWriter(writer));
+    return writer.toString();
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/WebSocketChannelClient.java b/examples/androidapp/src/org/appspot/apprtc/WebSocketChannelClient.java
new file mode 100644
index 0000000..14b7231
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/WebSocketChannelClient.java
@@ -0,0 +1,305 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.util.AsyncHttpURLConnection;
+import org.appspot.apprtc.util.AsyncHttpURLConnection.AsyncHttpEvents;
+import org.appspot.apprtc.util.LooperExecutor;
+
+import android.util.Log;
+
+import de.tavendo.autobahn.WebSocket.WebSocketConnectionObserver;
+import de.tavendo.autobahn.WebSocketConnection;
+import de.tavendo.autobahn.WebSocketException;
+
+import org.json.JSONException;
+import org.json.JSONObject;
+
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.util.LinkedList;
+
+/**
+ * WebSocket client implementation.
+ *
+ * <p>All public methods should be called from the looper executor thread
+ * passed in the constructor; otherwise an exception will be thrown.
+ * All events are dispatched on the same thread.
+ */
+public class WebSocketChannelClient {
+  private static final String TAG = "WSChannelRTCClient";
+  private static final int CLOSE_TIMEOUT = 1000;
+  private final WebSocketChannelEvents events;
+  private final LooperExecutor executor;
+  private WebSocketConnection ws;
+  private WebSocketObserver wsObserver;
+  private String wsServerUrl;
+  private String postServerUrl;
+  private String roomID;
+  private String clientID;
+  private WebSocketConnectionState state;
+  private final Object closeEventLock = new Object();
+  private boolean closeEvent;
+  // WebSocket send queue. Messages are added to the queue when the WebSocket
+  // client is not yet registered and are consumed in the register() call.
+  private final LinkedList<String> wsSendQueue;
+
+  /**
+   * Possible WebSocket connection states.
+   */
+  public enum WebSocketConnectionState {
+    NEW, CONNECTED, REGISTERED, CLOSED, ERROR
+  };
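+
+  // Lifecycle sketch, based on the code below: connect() opens the socket and
+  // onOpen() moves the state from NEW to CONNECTED; register() moves it to
+  // REGISTERED and flushes the send queue; disconnect() sends "bye"/DELETE
+  // when registered and finally moves the state to CLOSED.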
+
+  /**
+   * Callback interface for messages delivered on WebSocket.
+   * All events are dispatched from a looper executor thread.
+   */
+  public interface WebSocketChannelEvents {
+    public void onWebSocketMessage(final String message);
+    public void onWebSocketClose();
+    public void onWebSocketError(final String description);
+  }
+
+  public WebSocketChannelClient(LooperExecutor executor, WebSocketChannelEvents events) {
+    this.executor = executor;
+    this.events = events;
+    roomID = null;
+    clientID = null;
+    wsSendQueue = new LinkedList<String>();
+    state = WebSocketConnectionState.NEW;
+  }
+
+  public WebSocketConnectionState getState() {
+    return state;
+  }
+
+  public void connect(final String wsUrl, final String postUrl) {
+    checkIfCalledOnValidThread();
+    if (state != WebSocketConnectionState.NEW) {
+      Log.e(TAG, "WebSocket is already connected.");
+      return;
+    }
+    wsServerUrl = wsUrl;
+    postServerUrl = postUrl;
+    closeEvent = false;
+
+    Log.d(TAG, "Connecting WebSocket to: " + wsUrl + ". Post URL: " + postUrl);
+    ws = new WebSocketConnection();
+    wsObserver = new WebSocketObserver();
+    try {
+      ws.connect(new URI(wsServerUrl), wsObserver);
+    } catch (URISyntaxException e) {
+      reportError("URI error: " + e.getMessage());
+    } catch (WebSocketException e) {
+      reportError("WebSocket connection error: " + e.getMessage());
+    }
+  }
+
+  public void register(final String roomID, final String clientID) {
+    checkIfCalledOnValidThread();
+    this.roomID = roomID;
+    this.clientID = clientID;
+    if (state != WebSocketConnectionState.CONNECTED) {
+      Log.w(TAG, "WebSocket register() in state " + state);
+      return;
+    }
+    Log.d(TAG, "Registering WebSocket for room " + roomID + ". CLientID: " + clientID);
+    JSONObject json = new JSONObject();
+    try {
+      json.put("cmd", "register");
+      json.put("roomid", roomID);
+      json.put("clientid", clientID);
+      Log.d(TAG, "C->WSS: " + json.toString());
+      ws.sendTextMessage(json.toString());
+      state = WebSocketConnectionState.REGISTERED;
+      // Send any previously accumulated messages.
+      for (String sendMessage : wsSendQueue) {
+        send(sendMessage);
+      }
+      wsSendQueue.clear();
+    } catch (JSONException e) {
+      reportError("WebSocket register JSON error: " + e.getMessage());
+    }
+  }
+
+  public void send(String message) {
+    checkIfCalledOnValidThread();
+    switch (state) {
+      case NEW:
+      case CONNECTED:
+        // Store outgoing messages and send them after websocket client
+        // is registered.
+        Log.d(TAG, "WS ACC: " + message);
+        wsSendQueue.add(message);
+        return;
+      case ERROR:
+      case CLOSED:
+        Log.e(TAG, "WebSocket send() in error or closed state : " + message);
+        return;
+      case REGISTERED:
+        JSONObject json = new JSONObject();
+        try {
+          json.put("cmd", "send");
+          json.put("msg", message);
+          message = json.toString();
+          Log.d(TAG, "C->WSS: " + message);
+          ws.sendTextMessage(message);
+        } catch (JSONException e) {
+          reportError("WebSocket send JSON error: " + e.getMessage());
+        }
+        break;
+    }
+    return;
+  }
+
+  // Sends a message to the WebSocket server via HTTP POST. This can be used
+  // to deliver messages before the WebSocket connection is opened.
+  public void post(String message) {
+    checkIfCalledOnValidThread();
+    sendWSSMessage("POST", message);
+  }
+
+  public void disconnect(boolean waitForComplete) {
+    checkIfCalledOnValidThread();
+    Log.d(TAG, "Disonnect WebSocket. State: " + state);
+    if (state == WebSocketConnectionState.REGISTERED) {
+      // Send "bye" to WebSocket server.
+      send("{\"type\": \"bye\"}");
+      state = WebSocketConnectionState.CONNECTED;
+      // Send http DELETE to http WebSocket server.
+      sendWSSMessage("DELETE", "");
+    }
+    // Close WebSocket in CONNECTED or ERROR states only.
+    if (state == WebSocketConnectionState.CONNECTED
+        || state == WebSocketConnectionState.ERROR) {
+      ws.disconnect();
+      state = WebSocketConnectionState.CLOSED;
+
+      // Wait for the WebSocket close event to prevent the WebSocket library
+      // from sending any pending messages to a deleted looper thread.
+      if (waitForComplete) {
+        synchronized (closeEventLock) {
+          while (!closeEvent) {
+            try {
+              closeEventLock.wait(CLOSE_TIMEOUT);
+              break;
+            } catch (InterruptedException e) {
+              Log.e(TAG, "Wait error: " + e.toString());
+            }
+          }
+        }
+      }
+    }
+    Log.d(TAG, "Disonnecting WebSocket done.");
+  }
+
+  private void reportError(final String errorMessage) {
+    Log.e(TAG, errorMessage);
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (state != WebSocketConnectionState.ERROR) {
+          state = WebSocketConnectionState.ERROR;
+          events.onWebSocketError(errorMessage);
+        }
+      }
+    });
+  }
+
+  // Asynchronously send POST/DELETE to WebSocket server.
+  private void sendWSSMessage(final String method, final String message) {
+    String postUrl = postServerUrl + "/" + roomID + "/" + clientID;
+    Log.d(TAG, "WS " + method + " : " + postUrl + " : " + message);
+    AsyncHttpURLConnection httpConnection = new AsyncHttpURLConnection(
+        method, postUrl, message, new AsyncHttpEvents() {
+          @Override
+          public void onHttpError(String errorMessage) {
+            reportError("WS " + method + " error: " + errorMessage);
+          }
+
+          @Override
+          public void onHttpComplete(String response) {
+          }
+        });
+    httpConnection.send();
+  }
+
+  // Helper method for debugging purposes. Ensures that WebSocket methods are
+  // called on the looper thread.
+  private void checkIfCalledOnValidThread() {
+    if (!executor.checkOnLooperThread()) {
+      throw new IllegalStateException(
+          "WebSocket method is not called on valid thread");
+    }
+  }
+
+  private class WebSocketObserver implements WebSocketConnectionObserver {
+    @Override
+    public void onOpen() {
+      Log.d(TAG, "WebSocket connection opened to: " + wsServerUrl);
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          state = WebSocketConnectionState.CONNECTED;
+          // Check if we have pending register request.
+          if (roomID != null && clientID != null) {
+            register(roomID, clientID);
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onClose(WebSocketCloseNotification code, String reason) {
+      Log.d(TAG, "WebSocket connection closed. Code: " + code
+          + ". Reason: " + reason + ". State: " + state);
+      synchronized (closeEventLock) {
+        closeEvent = true;
+        closeEventLock.notify();
+      }
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (state != WebSocketConnectionState.CLOSED) {
+            state = WebSocketConnectionState.CLOSED;
+            events.onWebSocketClose();
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onTextMessage(String payload) {
+      Log.d(TAG, "WSS->C: " + payload);
+      final String message = payload;
+      executor.execute(new Runnable() {
+        @Override
+        public void run() {
+          if (state == WebSocketConnectionState.CONNECTED
+              || state == WebSocketConnectionState.REGISTERED) {
+            events.onWebSocketMessage(message);
+          }
+        }
+      });
+    }
+
+    @Override
+    public void onRawTextMessage(byte[] payload) {
+    }
+
+    @Override
+    public void onBinaryMessage(byte[] payload) {
+    }
+  }
+
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/WebSocketRTCClient.java b/examples/androidapp/src/org/appspot/apprtc/WebSocketRTCClient.java
new file mode 100644
index 0000000..ca319ab
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/WebSocketRTCClient.java
@@ -0,0 +1,379 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc;
+
+import org.appspot.apprtc.RoomParametersFetcher.RoomParametersFetcherEvents;
+import org.appspot.apprtc.WebSocketChannelClient.WebSocketChannelEvents;
+import org.appspot.apprtc.WebSocketChannelClient.WebSocketConnectionState;
+import org.appspot.apprtc.util.AsyncHttpURLConnection;
+import org.appspot.apprtc.util.AsyncHttpURLConnection.AsyncHttpEvents;
+import org.appspot.apprtc.util.LooperExecutor;
+
+import android.util.Log;
+
+import org.json.JSONException;
+import org.json.JSONObject;
+import org.webrtc.IceCandidate;
+import org.webrtc.SessionDescription;
+
+/**
+ * Negotiates signaling for chatting with apprtc.appspot.com "rooms".
+ * Uses the client<->server specifics of the apprtc AppEngine webapp.
+ *
+ * <p>To use: create an instance of this object (registering a message handler) and
+ * call connectToRoom(). Once the room connection is established, the
+ * onConnectedToRoom() callback is invoked with the room parameters.
+ * Messages to the other party (local ICE candidates and the answer SDP) can
+ * be sent once the WebSocket connection is established.
+ */
+public class WebSocketRTCClient implements AppRTCClient,
+    WebSocketChannelEvents {
+  private static final String TAG = "WSRTCClient";
+  private static final String ROOM_JOIN = "join";
+  private static final String ROOM_MESSAGE = "message";
+  private static final String ROOM_LEAVE = "leave";
+
+  private enum ConnectionState {
+    NEW, CONNECTED, CLOSED, ERROR
+  };
+  private enum MessageType {
+    MESSAGE, LEAVE
+  };
+  private final LooperExecutor executor;
+  private boolean initiator;
+  private SignalingEvents events;
+  private WebSocketChannelClient wsClient;
+  private ConnectionState roomState;
+  private RoomConnectionParameters connectionParameters;
+  private String messageUrl;
+  private String leaveUrl;
+
+  public WebSocketRTCClient(SignalingEvents events, LooperExecutor executor) {
+    this.events = events;
+    this.executor = executor;
+    roomState = ConnectionState.NEW;
+    executor.requestStart();
+  }
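+
+  // Typical use from the UI layer (a sketch; "signalingEvents" and
+  // "roomParameters" stand for caller-supplied objects):
+  //   AppRTCClient client =
+  //       new WebSocketRTCClient(signalingEvents, new LooperExecutor());
+  //   client.connectToRoom(roomParameters);
+  //   ...
+  //   client.disconnectFromRoom();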
+
+  // --------------------------------------------------------------------
+  // AppRTCClient interface implementation.
+  // Asynchronously connects to an AppRTC room URL using the supplied connection
+  // parameters, retrieves the room parameters, and connects to the WebSocket
+  // server.
+  @Override
+  public void connectToRoom(RoomConnectionParameters connectionParameters) {
+    this.connectionParameters = connectionParameters;
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        connectToRoomInternal();
+      }
+    });
+  }
+
+  @Override
+  public void disconnectFromRoom() {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        disconnectFromRoomInternal();
+      }
+    });
+    executor.requestStop();
+  }
+
+  // Connects to room - function runs on a local looper thread.
+  private void connectToRoomInternal() {
+    String connectionUrl = getConnectionUrl(connectionParameters);
+    Log.d(TAG, "Connect to room: " + connectionUrl);
+    roomState = ConnectionState.NEW;
+    wsClient = new WebSocketChannelClient(executor, this);
+
+    RoomParametersFetcherEvents callbacks = new RoomParametersFetcherEvents() {
+      @Override
+      public void onSignalingParametersReady(
+          final SignalingParameters params) {
+        WebSocketRTCClient.this.executor.execute(new Runnable() {
+          @Override
+          public void run() {
+            WebSocketRTCClient.this.signalingParametersReady(params);
+          }
+        });
+      }
+
+      @Override
+      public void onSignalingParametersError(String description) {
+        WebSocketRTCClient.this.reportError(description);
+      }
+    };
+
+    new RoomParametersFetcher(connectionUrl, null, callbacks).makeRequest();
+  }
+
+  // Disconnect from room and send bye messages - runs on a local looper thread.
+  private void disconnectFromRoomInternal() {
+    Log.d(TAG, "Disconnect. Room state: " + roomState);
+    if (roomState == ConnectionState.CONNECTED) {
+      Log.d(TAG, "Closing room.");
+      sendPostMessage(MessageType.LEAVE, leaveUrl, null);
+    }
+    roomState = ConnectionState.CLOSED;
+    if (wsClient != null) {
+      wsClient.disconnect(true);
+    }
+  }
+
+  // Helper functions to build the connection, message and leave URLs.
+  private String getConnectionUrl(
+      RoomConnectionParameters connectionParameters) {
+    return connectionParameters.roomUrl + "/" + ROOM_JOIN + "/"
+        + connectionParameters.roomId;
+  }
+
+  private String getMessageUrl(RoomConnectionParameters connectionParameters,
+      SignalingParameters signalingParameters) {
+    return connectionParameters.roomUrl + "/" + ROOM_MESSAGE + "/"
+      + connectionParameters.roomId + "/" + signalingParameters.clientId;
+  }
+
+  private String getLeaveUrl(RoomConnectionParameters connectionParameters,
+      SignalingParameters signalingParameters) {
+    return connectionParameters.roomUrl + "/" + ROOM_LEAVE + "/"
+        + connectionParameters.roomId + "/" + signalingParameters.clientId;
+  }
+
+  // Callback issued when room parameters are extracted. Runs on local
+  // looper thread.
+  private void signalingParametersReady(
+      final SignalingParameters signalingParameters) {
+    Log.d(TAG, "Room connection completed.");
+    if (connectionParameters.loopback
+        && (!signalingParameters.initiator
+            || signalingParameters.offerSdp != null)) {
+      reportError("Loopback room is busy.");
+      return;
+    }
+    if (!connectionParameters.loopback
+        && !signalingParameters.initiator
+        && signalingParameters.offerSdp == null) {
+      Log.w(TAG, "No offer SDP in room response.");
+    }
+    initiator = signalingParameters.initiator;
+    messageUrl = getMessageUrl(connectionParameters, signalingParameters);
+    leaveUrl = getLeaveUrl(connectionParameters, signalingParameters);
+    Log.d(TAG, "Message URL: " + messageUrl);
+    Log.d(TAG, "Leave URL: " + leaveUrl);
+    roomState = ConnectionState.CONNECTED;
+
+    // Fire connection and signaling parameters events.
+    events.onConnectedToRoom(signalingParameters);
+
+    // Connect and register WebSocket client.
+    wsClient.connect(signalingParameters.wssUrl, signalingParameters.wssPostUrl);
+    wsClient.register(connectionParameters.roomId, signalingParameters.clientId);
+  }
+
+  // Send local offer SDP to the other participant.
+  @Override
+  public void sendOfferSdp(final SessionDescription sdp) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (roomState != ConnectionState.CONNECTED) {
+          reportError("Sending offer SDP in non connected state.");
+          return;
+        }
+        JSONObject json = new JSONObject();
+        jsonPut(json, "sdp", sdp.description);
+        jsonPut(json, "type", "offer");
+        sendPostMessage(MessageType.MESSAGE, messageUrl, json.toString());
+        if (connectionParameters.loopback) {
+          // In loopback mode rename this offer to answer and route it back.
+          SessionDescription sdpAnswer = new SessionDescription(
+              SessionDescription.Type.fromCanonicalForm("answer"),
+              sdp.description);
+          events.onRemoteDescription(sdpAnswer);
+        }
+      }
+    });
+  }
+
+  // Send local answer SDP to the other participant.
+  @Override
+  public void sendAnswerSdp(final SessionDescription sdp) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (connectionParameters.loopback) {
+          Log.e(TAG, "Sending answer in loopback mode.");
+          return;
+        }
+        JSONObject json = new JSONObject();
+        jsonPut(json, "sdp", sdp.description);
+        jsonPut(json, "type", "answer");
+        wsClient.send(json.toString());
+      }
+    });
+  }
+
+  // Send Ice candidate to the other participant.
+  @Override
+  public void sendLocalIceCandidate(final IceCandidate candidate) {
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        JSONObject json = new JSONObject();
+        jsonPut(json, "type", "candidate");
+        jsonPut(json, "label", candidate.sdpMLineIndex);
+        jsonPut(json, "id", candidate.sdpMid);
+        jsonPut(json, "candidate", candidate.sdp);
+        if (initiator) {
+          // The call initiator sends ICE candidates to the GAE server.
+          if (roomState != ConnectionState.CONNECTED) {
+            reportError("Sending ICE candidate in non connected state.");
+            return;
+          }
+          sendPostMessage(MessageType.MESSAGE, messageUrl, json.toString());
+          if (connectionParameters.loopback) {
+            events.onRemoteIceCandidate(candidate);
+          }
+        } else {
+          // The call receiver sends ICE candidates to the WebSocket server.
+          wsClient.send(json.toString());
+        }
+      }
+    });
+  }
+
+  // --------------------------------------------------------------------
+  // WebSocketChannelEvents interface implementation.
+  // All events are called by WebSocketChannelClient on a local looper thread
+  // (passed to WebSocket client constructor).
+  @Override
+  public void onWebSocketMessage(final String msg) {
+    if (wsClient.getState() != WebSocketConnectionState.REGISTERED) {
+      Log.e(TAG, "Got WebSocket message in non registered state.");
+      return;
+    }
+    try {
+      JSONObject json = new JSONObject(msg);
+      String msgText = json.getString("msg");
+      String errorText = json.optString("error");
+      if (msgText.length() > 0) {
+        json = new JSONObject(msgText);
+        String type = json.optString("type");
+        if (type.equals("candidate")) {
+          IceCandidate candidate = new IceCandidate(
+              json.getString("id"),
+              json.getInt("label"),
+              json.getString("candidate"));
+          events.onRemoteIceCandidate(candidate);
+        } else if (type.equals("answer")) {
+          if (initiator) {
+            SessionDescription sdp = new SessionDescription(
+                SessionDescription.Type.fromCanonicalForm(type),
+                json.getString("sdp"));
+            events.onRemoteDescription(sdp);
+          } else {
+            reportError("Received answer for call initiator: " + msg);
+          }
+        } else if (type.equals("offer")) {
+          if (!initiator) {
+            SessionDescription sdp = new SessionDescription(
+                SessionDescription.Type.fromCanonicalForm(type),
+                json.getString("sdp"));
+            events.onRemoteDescription(sdp);
+          } else {
+            reportError("Received offer for call receiver: " + msg);
+          }
+        } else if (type.equals("bye")) {
+          events.onChannelClose();
+        } else {
+          reportError("Unexpected WebSocket message: " + msg);
+        }
+      } else {
+        if (errorText != null && errorText.length() > 0) {
+          reportError("WebSocket error message: " + errorText);
+        } else {
+          reportError("Unexpected WebSocket message: " + msg);
+        }
+      }
+    } catch (JSONException e) {
+      reportError("WebSocket message JSON parsing error: " + e.toString());
+    }
+  }
+
+  @Override
+  public void onWebSocketClose() {
+    events.onChannelClose();
+  }
+
+  @Override
+  public void onWebSocketError(String description) {
+    reportError("WebSocket error: " + description);
+  }
+
+  // --------------------------------------------------------------------
+  // Helper functions.
+  private void reportError(final String errorMessage) {
+    Log.e(TAG, errorMessage);
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        if (roomState != ConnectionState.ERROR) {
+          roomState = ConnectionState.ERROR;
+          events.onChannelError(errorMessage);
+        }
+      }
+    });
+  }
+
+  // Put a |key|->|value| mapping in |json|.
+  private static void jsonPut(JSONObject json, String key, Object value) {
+    try {
+      json.put(key, value);
+    } catch (JSONException e) {
+      throw new RuntimeException(e);
+    }
+  }
+
+  // Send SDP or ICE candidate to a room server.
+  private void sendPostMessage(
+      final MessageType messageType, final String url, final String message) {
+    String logInfo = url;
+    if (message != null) {
+      logInfo += ". Message: " + message;
+    }
+    Log.d(TAG, "C->GAE: " + logInfo);
+    AsyncHttpURLConnection httpConnection = new AsyncHttpURLConnection(
+      "POST", url, message, new AsyncHttpEvents() {
+        @Override
+        public void onHttpError(String errorMessage) {
+          reportError("GAE POST error: " + errorMessage);
+        }
+
+        @Override
+        public void onHttpComplete(String response) {
+          if (messageType == MessageType.MESSAGE) {
+            try {
+              JSONObject roomJson = new JSONObject(response);
+              String result = roomJson.getString("result");
+              if (!result.equals("SUCCESS")) {
+                reportError("GAE POST error: " + result);
+              }
+            } catch (JSONException e) {
+              reportError("GAE POST JSON error: " + e.toString());
+            }
+          }
+        }
+      });
+    httpConnection.send();
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/util/AppRTCUtils.java b/examples/androidapp/src/org/appspot/apprtc/util/AppRTCUtils.java
new file mode 100644
index 0000000..db5ef3d
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/util/AppRTCUtils.java
@@ -0,0 +1,67 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc.util;
+
+import android.os.Build;
+import android.util.Log;
+
+/**
+ * AppRTCUtils provides helper functions for managing thread safety.
+ */
+public final class AppRTCUtils {
+
+  private AppRTCUtils() {
+  }
+
+  /**
+   * NonThreadSafe is a helper class used to help verify that methods of a
+   * class are called from the same thread.
+   */
+  public static class NonThreadSafe {
+    private final Long threadId;
+
+    public NonThreadSafe() {
+      // Store thread ID of the creating thread.
+      threadId = Thread.currentThread().getId();
+    }
+
+    /** Checks if the method is called on the valid/creating thread. */
+    public boolean calledOnValidThread() {
+      return threadId.equals(Thread.currentThread().getId());
+    }
+  }
+
+  /** Helper method which throws an exception when an assertion has failed. */
+  public static void assertIsTrue(boolean condition) {
+    if (!condition) {
+      throw new AssertionError("Expected condition to be true");
+    }
+  }
+
+  /** Helper method for building a string of thread information. */
+  public static String getThreadInfo() {
+    return "@[name=" + Thread.currentThread().getName()
+        + ", id=" + Thread.currentThread().getId() + "]";
+  }
+
+  /** Information about the current build, taken from system properties. */
+  public static void logDeviceInfo(String tag) {
+    Log.d(tag, "Android SDK: " + Build.VERSION.SDK_INT + ", "
+        + "Release: " + Build.VERSION.RELEASE + ", "
+        + "Brand: " + Build.BRAND + ", "
+        + "Device: " + Build.DEVICE + ", "
+        + "Id: " + Build.ID + ", "
+        + "Hardware: " + Build.HARDWARE + ", "
+        + "Manufacturer: " + Build.MANUFACTURER + ", "
+        + "Model: " + Build.MODEL + ", "
+        + "Product: " + Build.PRODUCT);
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/util/AsyncHttpURLConnection.java b/examples/androidapp/src/org/appspot/apprtc/util/AsyncHttpURLConnection.java
new file mode 100644
index 0000000..a56d4ea
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/util/AsyncHttpURLConnection.java
@@ -0,0 +1,122 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc.util;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.net.HttpURLConnection;
+import java.net.SocketTimeoutException;
+import java.net.URL;
+import java.util.Scanner;
+
+/**
+ * Asynchronous http requests implementation.
+ */
+public class AsyncHttpURLConnection {
+  private static final int HTTP_TIMEOUT_MS = 8000;
+  private static final String HTTP_ORIGIN = "https://apprtc.appspot.com";
+  private final String method;
+  private final String url;
+  private final String message;
+  private final AsyncHttpEvents events;
+  private String contentType;
+
+  /**
+   * Http requests callbacks.
+   */
+  public interface AsyncHttpEvents {
+    public void onHttpError(String errorMessage);
+    public void onHttpComplete(String response);
+  }
+
+  public AsyncHttpURLConnection(String method, String url, String message,
+      AsyncHttpEvents events) {
+    this.method = method;
+    this.url = url;
+    this.message = message;
+    this.events = events;
+  }
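+
+  // Typical use (a sketch; mirrors the callers elsewhere in this patch):
+  //   new AsyncHttpURLConnection("POST", url, message, events).send();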
+
+  public void setContentType(String contentType) {
+    this.contentType = contentType;
+  }
+
+  public void send() {
+    Runnable runHttp = new Runnable() {
+      public void run() {
+        sendHttpMessage();
+      }
+    };
+    new Thread(runHttp).start();
+  }
+
+  private void sendHttpMessage() {
+    try {
+      HttpURLConnection connection =
+        (HttpURLConnection) new URL(url).openConnection();
+      byte[] postData = new byte[0];
+      if (message != null) {
+        postData = message.getBytes("UTF-8");
+      }
+      connection.setRequestMethod(method);
+      connection.setUseCaches(false);
+      connection.setDoInput(true);
+      connection.setConnectTimeout(HTTP_TIMEOUT_MS);
+      connection.setReadTimeout(HTTP_TIMEOUT_MS);
+      // TODO(glaznev) - query request origin from pref_room_server_url_key preferences.
+      connection.addRequestProperty("origin", HTTP_ORIGIN);
+      boolean doOutput = false;
+      if (method.equals("POST")) {
+        doOutput = true;
+        connection.setDoOutput(true);
+        connection.setFixedLengthStreamingMode(postData.length);
+      }
+      if (contentType == null) {
+        connection.setRequestProperty("Content-Type", "text/plain; charset=utf-8");
+      } else {
+        connection.setRequestProperty("Content-Type", contentType);
+      }
+
+      // Send POST request.
+      if (doOutput && postData.length > 0) {
+        OutputStream outStream = connection.getOutputStream();
+        outStream.write(postData);
+        outStream.close();
+      }
+
+      // Get response.
+      int responseCode = connection.getResponseCode();
+      if (responseCode != 200) {
+        events.onHttpError("Non-200 response to " + method + " to URL: "
+            + url + " : " + connection.getHeaderField(null));
+        connection.disconnect();
+        return;
+      }
+      InputStream responseStream = connection.getInputStream();
+      String response = drainStream(responseStream);
+      responseStream.close();
+      connection.disconnect();
+      events.onHttpComplete(response);
+    } catch (SocketTimeoutException e) {
+      events.onHttpError("HTTP " + method + " to " + url + " timeout");
+    } catch (IOException e) {
+      events.onHttpError("HTTP " + method + " to " + url + " error: "
+          + e.getMessage());
+    }
+  }
+
+  // Return the contents of an InputStream as a String.
+  private static String drainStream(InputStream in) {
+    Scanner s = new Scanner(in).useDelimiter("\\A");
+    return s.hasNext() ? s.next() : "";
+  }
+}
diff --git a/examples/androidapp/src/org/appspot/apprtc/util/LooperExecutor.java b/examples/androidapp/src/org/appspot/apprtc/util/LooperExecutor.java
new file mode 100644
index 0000000..1563e26
--- /dev/null
+++ b/examples/androidapp/src/org/appspot/apprtc/util/LooperExecutor.java
@@ -0,0 +1,95 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc.util;
+
+import android.os.Handler;
+import android.os.Looper;
+import android.util.Log;
+
+import java.util.concurrent.Executor;
+
+/**
+ * Looper based executor class.
+ */
+public class LooperExecutor extends Thread implements Executor {
+  private static final String TAG = "LooperExecutor";
+  // Object used to signal that the looper thread has started and the Handler
+  // instance associated with it has been allocated.
+  private final Object looperStartedEvent = new Object();
+  private Handler handler = null;
+  private boolean running = false;
+  private long threadId;
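+
+  // Lifecycle sketch, mirroring how WebSocketRTCClient drives this class:
+  // requestStart() spins up the looper thread, execute() posts work to it
+  // (or runs it inline when already on that thread), and requestStop() quits
+  // the looper.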
+
+  @Override
+  public void run() {
+    Looper.prepare();
+    synchronized (looperStartedEvent) {
+      Log.d(TAG, "Looper thread started.");
+      handler = new Handler();
+      threadId = Thread.currentThread().getId();
+      looperStartedEvent.notify();
+    }
+    Looper.loop();
+  }
+
+  public synchronized void requestStart() {
+    if (running) {
+      return;
+    }
+    running = true;
+    handler = null;
+    start();
+    // Wait for Handler allocation.
+    synchronized (looperStartedEvent) {
+      while (handler == null) {
+        try {
+          looperStartedEvent.wait();
+        } catch (InterruptedException e) {
+          Log.e(TAG, "Can not start looper thread");
+          running = false;
+        }
+      }
+    }
+  }
+
+  public synchronized void requestStop() {
+    if (!running) {
+      return;
+    }
+    running = false;
+    handler.post(new Runnable() {
+      @Override
+      public void run() {
+        Looper.myLooper().quit();
+        Log.d(TAG, "Looper thread finished.");
+      }
+    });
+  }
+
+  // Checks if current thread is a looper thread.
+  public boolean checkOnLooperThread() {
+    return (Thread.currentThread().getId() == threadId);
+  }
+
+  @Override
+  public synchronized void execute(final Runnable runnable) {
+    if (!running) {
+      Log.w(TAG, "Running looper executor without calling requestStart()");
+      return;
+    }
+    if (Thread.currentThread().getId() == threadId) {
+      runnable.run();
+    } else {
+      handler.post(runnable);
+    }
+  }
+
+}
diff --git a/examples/androidapp/third_party/autobanh/LICENSE b/examples/androidapp/third_party/autobanh/LICENSE
new file mode 100644
index 0000000..f433b1a
--- /dev/null
+++ b/examples/androidapp/third_party/autobanh/LICENSE
@@ -0,0 +1,177 @@
+
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
diff --git a/examples/androidapp/third_party/autobanh/LICENSE.md b/examples/androidapp/third_party/autobanh/LICENSE.md
new file mode 100644
index 0000000..2079e90
--- /dev/null
+++ b/examples/androidapp/third_party/autobanh/LICENSE.md
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2014 Cameron Lowell Palmer
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/examples/androidapp/third_party/autobanh/NOTICE b/examples/androidapp/third_party/autobanh/NOTICE
new file mode 100644
index 0000000..91ed7df
--- /dev/null
+++ b/examples/androidapp/third_party/autobanh/NOTICE
@@ -0,0 +1,3 @@
+AutobahnAndroid
+Copyright 2011,2012 Tavendo GmbH. Licensed under Apache 2.0
+This product includes software developed at Tavendo GmbH http://www.tavendo.de
diff --git a/examples/androidapp/third_party/autobanh/autobanh.jar b/examples/androidapp/third_party/autobanh/autobanh.jar
new file mode 100644
index 0000000..5a10b7f
--- /dev/null
+++ b/examples/androidapp/third_party/autobanh/autobanh.jar
Binary files differ
diff --git a/examples/androidtests/AndroidManifest.xml b/examples/androidtests/AndroidManifest.xml
new file mode 100644
index 0000000..f99f477
--- /dev/null
+++ b/examples/androidtests/AndroidManifest.xml
@@ -0,0 +1,17 @@
+<?xml version="1.0" encoding="utf-8"?>
+<manifest xmlns:android="http://schemas.android.com/apk/res/android"
+    package="org.appspot.apprtc.test"
+    android:versionCode="1"
+    android:versionName="1.0" >
+
+    <uses-sdk android:minSdkVersion="13" android:targetSdkVersion="21" />
+
+    <instrumentation
+        android:name="android.test.InstrumentationTestRunner"
+        android:targetPackage="org.appspot.apprtc" />
+
+    <application>
+        <uses-library android:name="android.test.runner" />
+    </application>
+
+</manifest>
\ No newline at end of file
diff --git a/examples/androidtests/README b/examples/androidtests/README
new file mode 100644
index 0000000..d32fb56
--- /dev/null
+++ b/examples/androidtests/README
@@ -0,0 +1,14 @@
+This directory contains an example unit test for Android AppRTCDemo.
+
+Example of building & using the app:
+
+- Build Android AppRTCDemo and AppRTCDemo unit test:
+cd <path/to/webrtc>/src
+ninja -C out/Debug AppRTCDemoTest
+
+- Install AppRTCDemo and AppRTCDemoTest:
+adb install -r out/Debug/apks/AppRTCDemo.apk
+adb install -r out/Debug/apks/AppRTCDemoTest.apk
+
+- Run unit tests:
+adb shell am instrument -w org.appspot.apprtc.test/android.test.InstrumentationTestRunner
\ No newline at end of file
diff --git a/examples/androidtests/ant.properties b/examples/androidtests/ant.properties
new file mode 100644
index 0000000..ec7d042
--- /dev/null
+++ b/examples/androidtests/ant.properties
@@ -0,0 +1,18 @@
+# This file is used to override default values used by the Ant build system.
+#
+# This file must be checked into Version Control Systems, as it is
+# integral to the build system of your project.
+
+# This file is only used by the Ant script.
+
+# You can use this to override default values such as
+#  'source.dir' for the location of your java source folder and
+#  'out.dir' for the location of your output folder.
+
+# You can also use it to define how the release builds are signed by declaring
+# the following properties:
+#  'key.store' for the location of your keystore and
+#  'key.alias' for the name of the key to use.
+# The password will be asked during the build when you use the 'release' target.
+
+tested.project.dir=../androidapp
diff --git a/examples/androidtests/build.xml b/examples/androidtests/build.xml
new file mode 100644
index 0000000..036759b
--- /dev/null
+++ b/examples/androidtests/build.xml
@@ -0,0 +1,92 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project name="AppRTCDemoTest" default="help">
+
+    <!-- The local.properties file is created and updated by the 'android' tool.
+         It contains the path to the SDK. It should *NOT* be checked into
+         Version Control Systems. -->
+    <property file="local.properties" />
+
+    <!-- The ant.properties file can be created by you. It is only edited by the
+         'android' tool to add properties to it.
+         This is the place to change some Ant specific build properties.
+         Here are some properties you may want to change/update:
+
+         source.dir
+             The name of the source directory. Default is 'src'.
+         out.dir
+             The name of the output directory. Default is 'bin'.
+
+         For other overridable properties, look at the beginning of the rules
+         files in the SDK, at tools/ant/build.xml
+
+         Properties related to the SDK location or the project target should
+         be updated using the 'android' tool with the 'update' action.
+
+         This file is an integral part of the build system for your
+         application and should be checked into Version Control Systems.
+
+         -->
+    <property file="ant.properties" />
+
+    <!-- If sdk.dir was not set from one of the property files, then
+         get it from the ANDROID_SDK_ROOT env var.
+         This must be done before we load project.properties since
+         the proguard config can use sdk.dir -->
+    <property environment="env" />
+    <condition property="sdk.dir" value="${env.ANDROID_SDK_ROOT}">
+        <isset property="env.ANDROID_SDK_ROOT" />
+    </condition>
+
+    <!-- The project.properties file is created and updated by the 'android'
+         tool, as well as ADT.
+
+         This contains project specific properties such as project target, and library
+         dependencies. Lower level build properties are stored in ant.properties
+         (or in .classpath for Eclipse projects).
+
+         This file is an integral part of the build system for your
+         application and should be checked into Version Control Systems. -->
+    <loadproperties srcFile="project.properties" />
+
+    <!-- quick check on sdk.dir -->
+    <fail
+            message="sdk.dir is missing. Make sure to generate local.properties using 'android update project' or to inject it through the ANDROID_HOME environment variable."
+            unless="sdk.dir"
+    />
+
+    <!--
+        Import per project custom build rules if present at the root of the project.
+        This is the place to put custom intermediary targets such as:
+            -pre-build
+            -pre-compile
+            -post-compile (This is typically used for code obfuscation.
+                           Compiled code location: ${out.classes.absolute.dir}
+                           If this is not done in place, override ${out.dex.input.absolute.dir})
+            -post-package
+            -post-build
+            -pre-clean
+    -->
+    <import file="custom_rules.xml" optional="true" />
+
+    <!-- Import the actual build file.
+
+         To customize existing targets, there are two options:
+         - Customize only one target:
+             - copy/paste the target into this file, *before* the
+               <import> task.
+             - customize it to your needs.
+         - Customize the whole content of build.xml
+             - copy/paste the content of the rules files (minus the top node)
+               into this file, replacing the <import> task.
+             - customize to your needs.
+
+         ***********************
+         ****** IMPORTANT ******
+         ***********************
+         In all cases you must update the value of version-tag below to read 'custom' instead of an integer,
+         in order to avoid having your file be overridden by tools such as "android update project"
+    -->
+    <!-- version-tag: 1 -->
+    <import file="${sdk.dir}/tools/ant/build.xml" />
+
+</project>
diff --git a/examples/androidtests/project.properties b/examples/androidtests/project.properties
new file mode 100644
index 0000000..a6ca533
--- /dev/null
+++ b/examples/androidtests/project.properties
@@ -0,0 +1,16 @@
+# This file is automatically generated by Android Tools.
+# Do not modify this file -- YOUR CHANGES WILL BE ERASED!
+#
+# This file must be checked in Version Control Systems.
+#
+# To customize properties used by the Ant build system edit
+# "ant.properties", and override values to adapt the script to your
+# project structure.
+#
+# To enable ProGuard to shrink and obfuscate your code, uncomment this (available properties: sdk.dir, user.home):
+#proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
+
+# Project target.
+target=android-22
+
+java.compilerargs=-Xlint:all -Werror
diff --git a/examples/androidtests/src/org/appspot/apprtc/test/LooperExecutorTest.java b/examples/androidtests/src/org/appspot/apprtc/test/LooperExecutorTest.java
new file mode 100644
index 0000000..29ccaef
--- /dev/null
+++ b/examples/androidtests/src/org/appspot/apprtc/test/LooperExecutorTest.java
@@ -0,0 +1,67 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc.test;
+
+import java.util.concurrent.CountDownLatch;
+import java.util.concurrent.TimeUnit;
+
+import org.appspot.apprtc.util.LooperExecutor;
+
+import android.test.InstrumentationTestCase;
+import android.util.Log;
+
+public class LooperExecutorTest extends InstrumentationTestCase {
+  private static final String TAG = "LooperTest";
+  private static final int WAIT_TIMEOUT = 5000;
+
+  public void testLooperExecutor() throws InterruptedException {
+    Log.d(TAG, "testLooperExecutor");
+    final int counter[] = new int[1];
+    final int expectedCounter = 10;
+    final CountDownLatch looperDone = new CountDownLatch(1);
+
+    Runnable counterIncRunnable = new Runnable() {
+      @Override
+      public void run() {
+        counter[0]++;
+        Log.d(TAG, "Run " + counter[0]);
+      }
+    };
+    LooperExecutor executor = new LooperExecutor();
+
+    // Try to execute a counter increment task before starting an executor.
+    executor.execute(counterIncRunnable);
+
+    // Start the executor and run the counter increment task the expected number of times.
+    executor.requestStart();
+    for (int i = 0; i < expectedCounter; i++) {
+      executor.execute(counterIncRunnable);
+    }
+    executor.execute(new Runnable() {
+      @Override
+      public void run() {
+        looperDone.countDown();
+      }
+    });
+    executor.requestStop();
+
+    // Try to execute a task after stopping the executor.
+    executor.execute(counterIncRunnable);
+
+    // Wait for the final looper task and make sure the counter increment task
+    // was executed the expected number of times.
+    looperDone.await(WAIT_TIMEOUT, TimeUnit.MILLISECONDS);
+    assertTrue(looperDone.getCount() == 0);
+    assertTrue(counter[0] == expectedCounter);
+
+    Log.d(TAG, "testLooperExecutor done");
+  }
+}
diff --git a/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java b/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java
new file mode 100644
index 0000000..00a8187
--- /dev/null
+++ b/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java
@@ -0,0 +1,438 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+package org.appspot.apprtc.test;
+
+import java.util.LinkedList;
+import java.util.List;
+import java.util.concurrent.CountDownLatch;
+import java.util.concurrent.TimeUnit;
+
+import org.appspot.apprtc.AppRTCClient.SignalingParameters;
+import org.appspot.apprtc.PeerConnectionClient;
+import org.appspot.apprtc.PeerConnectionClient.PeerConnectionEvents;
+import org.appspot.apprtc.PeerConnectionClient.PeerConnectionParameters;
+import org.appspot.apprtc.util.LooperExecutor;
+import org.webrtc.IceCandidate;
+import org.webrtc.MediaConstraints;
+import org.webrtc.PeerConnection;
+import org.webrtc.PeerConnectionFactory;
+import org.webrtc.SessionDescription;
+import org.webrtc.StatsReport;
+import org.webrtc.VideoRenderer;
+
+import android.test.InstrumentationTestCase;
+import android.util.Log;
+
+public class PeerConnectionClientTest extends InstrumentationTestCase
+    implements PeerConnectionEvents {
+  private static final String TAG = "RTCClientTest";
+  private static final int ICE_CONNECTION_WAIT_TIMEOUT = 10000;
+  private static final int WAIT_TIMEOUT = 7000;
+  private static final int CAMERA_SWITCH_ATTEMPTS = 3;
+  private static final int VIDEO_RESTART_ATTEMPTS = 3;
+  private static final int VIDEO_RESTART_TIMEOUT = 500;
+  private static final int EXPECTED_VIDEO_FRAMES = 10;
+  private static final String VIDEO_CODEC_VP8 = "VP8";
+  private static final String VIDEO_CODEC_VP9 = "VP9";
+  private static final String VIDEO_CODEC_H264 = "H264";
+  private static final int AUDIO_RUN_TIMEOUT = 1000;
+  private static final String LOCAL_RENDERER_NAME = "Local renderer";
+  private static final String REMOTE_RENDERER_NAME = "Remote renderer";
+
+  // The peer connection client is assumed to be thread safe in itself; the
+  // reference is written by the test thread and read by worker threads.
+  private volatile PeerConnectionClient pcClient;
+  private volatile boolean loopback;
+
+  // These are protected by their respective event objects.
+  private LooperExecutor signalingExecutor;
+  private boolean isClosed;
+  private boolean isIceConnected;
+  private SessionDescription localSdp;
+  private List<IceCandidate> iceCandidates = new LinkedList<IceCandidate>();
+  private final Object localSdpEvent = new Object();
+  private final Object iceCandidateEvent = new Object();
+  private final Object iceConnectedEvent = new Object();
+  private final Object closeEvent = new Object();
+
+  // Mock renderer implementation.
+  private static class MockRenderer implements VideoRenderer.Callbacks {
+    // These are protected by 'this' since we get called from worker threads.
+    private String rendererName;
+    private boolean renderFrameCalled = false;
+
+    // Thread-safe in itself.
+    private CountDownLatch doneRendering;
+
+    public MockRenderer(int expectedFrames, String rendererName) {
+      this.rendererName = rendererName;
+      reset(expectedFrames);
+    }
+
+    // Resets the renderer to wait for a new number of video frames.
+    public synchronized void reset(int expectedFrames) {
+      renderFrameCalled = false;
+      doneRendering = new CountDownLatch(expectedFrames);
+    }
+
+    // TODO(guoweis): Remove this once chrome code base is updated.
+    @Override
+    public boolean canApplyRotation() {
+      return false;
+    }
+
+    @Override
+    public synchronized void renderFrame(VideoRenderer.I420Frame frame) {
+      if (!renderFrameCalled) {
+        if (rendererName != null) {
+          Log.d(TAG, rendererName + " render frame: " + frame.width + " x " + frame.height);
+        } else {
+          Log.d(TAG, "Render frame: " + frame.width + " x " + frame.height);
+        }
+      }
+      renderFrameCalled = true;
+      doneRendering.countDown();
+    }
+
+
+    // This method shouldn't hold any locks or touch member variables since it
+    // blocks.
+    public boolean waitForFramesRendered(int timeoutMs)
+        throws InterruptedException {
+      doneRendering.await(timeoutMs, TimeUnit.MILLISECONDS);
+      return (doneRendering.getCount() <= 0);
+    }
+  }
+
+  // Peer connection events implementation.
+  @Override
+  public void onLocalDescription(SessionDescription sdp) {
+    Log.d(TAG, "LocalSDP type: " + sdp.type);
+    synchronized (localSdpEvent) {
+      localSdp = sdp;
+      localSdpEvent.notifyAll();
+    }
+  }
+
+  @Override
+  public void onIceCandidate(final IceCandidate candidate) {
+    synchronized(iceCandidateEvent) {
+      Log.d(TAG, "IceCandidate #" + iceCandidates.size() + " : " + candidate.toString());
+      if (loopback) {
+        // Loop back the local ICE candidate on a separate thread to avoid adding
+        // a remote ICE candidate from within a local ICE candidate callback.
+        signalingExecutor.execute(new Runnable() {
+          @Override
+          public void run() {
+            pcClient.addRemoteIceCandidate(candidate);
+          }
+        });
+      }
+      iceCandidates.add(candidate);
+      iceCandidateEvent.notifyAll();
+    }
+  }
+
+  @Override
+  public void onIceConnected() {
+    Log.d(TAG, "ICE Connected");
+    synchronized(iceConnectedEvent) {
+      isIceConnected = true;
+      iceConnectedEvent.notifyAll();
+    }
+  }
+
+  @Override
+  public void onIceDisconnected() {
+    Log.d(TAG, "ICE Disconnected");
+    synchronized(iceConnectedEvent) {
+      isIceConnected = false;
+      iceConnectedEvent.notifyAll();
+    }
+  }
+
+  @Override
+  public void onPeerConnectionClosed() {
+    Log.d(TAG, "PeerConnection closed");
+    synchronized(closeEvent) {
+      isClosed = true;
+      closeEvent.notifyAll();
+    }
+  }
+
+  @Override
+  public void onPeerConnectionError(String description) {
+    fail("PC Error: " + description);
+  }
+
+  @Override
+  public void onPeerConnectionStatsReady(StatsReport[] reports) {
+  }
+
+  // Helper wait functions.
+  private boolean waitForLocalSDP(int timeoutMs)
+      throws InterruptedException {
+    synchronized(localSdpEvent) {
+      if (localSdp == null) {
+        localSdpEvent.wait(timeoutMs);
+      }
+      return (localSdp != null);
+    }
+  }
+
+  private boolean waitForIceCandidates(int timeoutMs)
+      throws InterruptedException {
+    synchronized(iceCandidateEvent) {
+      if (iceCandidates.size() == 0) {
+        iceCandidateEvent.wait(timeoutMs);
+      }
+      return (iceCandidates.size() > 0);
+    }
+  }
+
+  private boolean waitForIceConnected(int timeoutMs)
+      throws InterruptedException {
+    synchronized(iceConnectedEvent) {
+      if (!isIceConnected) {
+        iceConnectedEvent.wait(timeoutMs);
+      }
+      if (!isIceConnected) {
+        Log.e(TAG, "ICE connection failure");
+      }
+
+      return isIceConnected;
+    }
+  }
+
+  private boolean waitForPeerConnectionClosed(int timeoutMs)
+      throws InterruptedException {
+    synchronized(closeEvent) {
+      if (!isClosed) {
+        closeEvent.wait(timeoutMs);
+      }
+      return isClosed;
+    }
+  }
+
+  PeerConnectionClient createPeerConnectionClient(
+      MockRenderer localRenderer, MockRenderer remoteRenderer,
+      boolean enableVideo, String videoCodec) {
+    List<PeerConnection.IceServer> iceServers =
+        new LinkedList<PeerConnection.IceServer>();
+    SignalingParameters signalingParameters = new SignalingParameters(
+        iceServers, true, // iceServers, initiator.
+        null, null, null, // clientId, wssUrl, wssPostUrl.
+        null, null); // offerSdp, iceCandidates.
+    PeerConnectionParameters peerConnectionParameters =
+        new PeerConnectionParameters(
+            enableVideo, true, // videoCallEnabled, loopback.
+            0, 0, 0, 0, videoCodec, true, // video codec parameters.
+            0, "OPUS", false, true); // audio codec parameters.
+
+    PeerConnectionClient client = PeerConnectionClient.getInstance();
+    PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
+    options.networkIgnoreMask = 0;
+    client.setPeerConnectionFactoryOptions(options);
+    client.createPeerConnectionFactory(
+        getInstrumentation().getContext(), null,
+        peerConnectionParameters, this);
+    client.createPeerConnection(
+        localRenderer, remoteRenderer, signalingParameters);
+    client.createOffer();
+    return client;
+  }
+
+  @Override
+  public void setUp() {
+    signalingExecutor = new LooperExecutor();
+    signalingExecutor.requestStart();
+  }
+
+  @Override
+  public void tearDown() {
+    signalingExecutor.requestStop();
+  }
+
+  public void testSetLocalOfferMakesVideoFlowLocally()
+      throws InterruptedException {
+    Log.d(TAG, "testSetLocalOfferMakesVideoFlowLocally");
+    MockRenderer localRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, LOCAL_RENDERER_NAME);
+    pcClient = createPeerConnectionClient(
+        localRenderer, new MockRenderer(0, null), true, VIDEO_CODEC_VP8);
+
+    // Wait for the local SDP to be set and ICE candidates to be generated.
+    assertTrue("Local SDP was not set.", waitForLocalSDP(WAIT_TIMEOUT));
+    assertTrue("ICE candidates were not generated.",
+        waitForIceCandidates(WAIT_TIMEOUT));
+
+    // Check that local video frames were rendered.
+    assertTrue("Local video frames were not rendered.",
+        localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+
+    pcClient.close();
+    assertTrue("PeerConnection close event was not received.",
+        waitForPeerConnectionClosed(WAIT_TIMEOUT));
+    Log.d(TAG, "testSetLocalOfferMakesVideoFlowLocally Done.");
+  }
+
+  private void doLoopbackTest(boolean enableVideo, String videoCodec)
+      throws InterruptedException {
+    loopback = true;
+    MockRenderer localRenderer = null;
+    MockRenderer remoteRenderer = null;
+    if (enableVideo) {
+      Log.d(TAG, "testLoopback for video " + videoCodec);
+      localRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, LOCAL_RENDERER_NAME);
+      remoteRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, REMOTE_RENDERER_NAME);
+    } else {
+      Log.d(TAG, "testLoopback for audio.");
+    }
+    pcClient = createPeerConnectionClient(
+        localRenderer, remoteRenderer, enableVideo, videoCodec);
+
+    // Wait for the local SDP, relabel it as an answer and set it as the remote SDP.
+    assertTrue("Local SDP was not set.", waitForLocalSDP(WAIT_TIMEOUT));
+    SessionDescription remoteSdp = new SessionDescription(
+        SessionDescription.Type.fromCanonicalForm("answer"),
+        localSdp.description);
+    pcClient.setRemoteDescription(remoteSdp);
+
+    // Wait for ICE connection.
+    assertTrue("ICE connection failure.", waitForIceConnected(ICE_CONNECTION_WAIT_TIMEOUT));
+
+    if (enableVideo) {
+      // Check that local and remote video frames were rendered.
+      assertTrue("Local video frames were not rendered.",
+          localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+      assertTrue("Remote video frames were not rendered.",
+          remoteRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+    } else {
+      // For audio just sleep for 1 sec.
+      // TODO(glaznev): check how we can detect that remote audio was rendered.
+      Thread.sleep(AUDIO_RUN_TIMEOUT);
+    }
+
+    pcClient.close();
+    assertTrue(waitForPeerConnectionClosed(WAIT_TIMEOUT));
+    Log.d(TAG, "testLoopback done.");
+  }
+
+  public void testLoopbackAudio() throws InterruptedException {
+    doLoopbackTest(false, VIDEO_CODEC_VP8);
+  }
+
+  public void testLoopbackVp8() throws InterruptedException {
+    doLoopbackTest(true, VIDEO_CODEC_VP8);
+  }
+
+  public void DISABLED_testLoopbackVp9() throws InterruptedException {
+    doLoopbackTest(true, VIDEO_CODEC_VP9);
+  }
+
+  public void testLoopbackH264() throws InterruptedException {
+    doLoopbackTest(true, VIDEO_CODEC_H264);
+  }
+
+  // Checks that the default front camera can be switched to the back camera
+  // and then back to the front camera.
+  public void testCameraSwitch() throws InterruptedException {
+    Log.d(TAG, "testCameraSwitch");
+    loopback = true;
+
+    MockRenderer localRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, LOCAL_RENDERER_NAME);
+    MockRenderer remoteRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, REMOTE_RENDERER_NAME);
+
+    pcClient = createPeerConnectionClient(
+        localRenderer, remoteRenderer, true, VIDEO_CODEC_VP8);
+
+    // Wait for the local SDP, relabel it as an answer and set it as the remote SDP.
+    assertTrue("Local SDP was not set.", waitForLocalSDP(WAIT_TIMEOUT));
+    SessionDescription remoteSdp = new SessionDescription(
+        SessionDescription.Type.fromCanonicalForm("answer"),
+        localSdp.description);
+    pcClient.setRemoteDescription(remoteSdp);
+
+    // Wait for ICE connection.
+    assertTrue("ICE connection failure.", waitForIceConnected(ICE_CONNECTION_WAIT_TIMEOUT));
+
+    // Check that local and remote video frames were rendered.
+    assertTrue("Local video frames were not rendered before camera switch.",
+        localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+    assertTrue("Remote video frames were not rendered before camera switch.",
+        remoteRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+
+    for (int i = 0; i < CAMERA_SWITCH_ATTEMPTS; i++) {
+      // Try to switch camera
+      pcClient.switchCamera();
+
+      // Reset the video renderers and check that local and remote video frames
+      // were rendered after the camera switch.
+      localRenderer.reset(EXPECTED_VIDEO_FRAMES);
+      remoteRenderer.reset(EXPECTED_VIDEO_FRAMES);
+      assertTrue("Local video frames were not rendered after camera switch.",
+          localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+      assertTrue("Remote video frames were not rendered after camera switch.",
+          remoteRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+    }
+    pcClient.close();
+    assertTrue(waitForPeerConnectionClosed(WAIT_TIMEOUT));
+    Log.d(TAG, "testCameraSwitch done.");
+  }
+
+  // Checks that the video source can be restarted, simulating the app going
+  // to the background and back to the foreground.
+  public void testVideoSourceRestart() throws InterruptedException {
+    Log.d(TAG, "testVideoSourceRestart");
+    loopback = true;
+
+    MockRenderer localRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, LOCAL_RENDERER_NAME);
+    MockRenderer remoteRenderer = new MockRenderer(EXPECTED_VIDEO_FRAMES, REMOTE_RENDERER_NAME);
+
+    pcClient = createPeerConnectionClient(
+        localRenderer, remoteRenderer, true, VIDEO_CODEC_VP8);
+
+    // Wait for the local SDP, relabel it as an answer and set it as the remote SDP.
+    assertTrue("Local SDP was not set.", waitForLocalSDP(WAIT_TIMEOUT));
+    SessionDescription remoteSdp = new SessionDescription(
+        SessionDescription.Type.fromCanonicalForm("answer"),
+        localSdp.description);
+    pcClient.setRemoteDescription(remoteSdp);
+
+    // Wait for ICE connection.
+    assertTrue("ICE connection failure.", waitForIceConnected(ICE_CONNECTION_WAIT_TIMEOUT));
+
+    // Check that local and remote video frames were rendered.
+    assertTrue("Local video frames were not rendered before video restart.",
+        localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+    assertTrue("Remote video frames were not rendered before video restart.",
+        remoteRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+
+    // Stop and then start video source a few times.
+    for (int i = 0; i < VIDEO_RESTART_ATTEMPTS; i++) {
+      pcClient.stopVideoSource();
+      Thread.sleep(VIDEO_RESTART_TIMEOUT);
+      pcClient.startVideoSource();
+
+      // Reset the video renderers and check that local and remote video frames
+      // were rendered after the video restart.
+      localRenderer.reset(EXPECTED_VIDEO_FRAMES);
+      remoteRenderer.reset(EXPECTED_VIDEO_FRAMES);
+      assertTrue("Local video frames were not rendered after video restart.",
+          localRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+      assertTrue("Remote video frames were not rendered after video restart.",
+          remoteRenderer.waitForFramesRendered(WAIT_TIMEOUT));
+    }
+    pcClient.close();
+    assertTrue(waitForPeerConnectionClosed(WAIT_TIMEOUT));
+    Log.d(TAG, "testVideoSourceRestart done.");
+  }
+
+}
diff --git a/examples/objc/.clang-format b/examples/objc/.clang-format
new file mode 120000
index 0000000..ce43d52
--- /dev/null
+++ b/examples/objc/.clang-format
@@ -0,0 +1 @@
+../../app/webrtc/objc/.clang-format
\ No newline at end of file
diff --git a/examples/objc/AppRTCDemo/ARDAppClient+Internal.h b/examples/objc/AppRTCDemo/ARDAppClient+Internal.h
new file mode 100644
index 0000000..c4a3871
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDAppClient+Internal.h
@@ -0,0 +1,52 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDAppClient.h"
+
+#import "ARDRoomServerClient.h"
+#import "ARDSignalingChannel.h"
+#import "ARDTURNClient.h"
+#import "RTCPeerConnection.h"
+#import "RTCPeerConnectionDelegate.h"
+#import "RTCPeerConnectionFactory.h"
+#import "RTCSessionDescriptionDelegate.h"
+
+@interface ARDAppClient () <ARDSignalingChannelDelegate,
+    RTCPeerConnectionDelegate, RTCSessionDescriptionDelegate>
+
+// All properties should only be mutated from the main queue.
+@property(nonatomic, strong) id<ARDRoomServerClient> roomServerClient;
+@property(nonatomic, strong) id<ARDSignalingChannel> channel;
+@property(nonatomic, strong) id<ARDTURNClient> turnClient;
+
+@property(nonatomic, strong) RTCPeerConnection *peerConnection;
+@property(nonatomic, strong) RTCPeerConnectionFactory *factory;
+@property(nonatomic, strong) NSMutableArray *messageQueue;
+
+@property(nonatomic, assign) BOOL isTurnComplete;
+@property(nonatomic, assign) BOOL hasReceivedSdp;
+@property(nonatomic, readonly) BOOL hasJoinedRoomServerRoom;
+
+@property(nonatomic, strong) NSString *roomId;
+@property(nonatomic, strong) NSString *clientId;
+@property(nonatomic, assign) BOOL isInitiator;
+@property(nonatomic, strong) NSMutableArray *iceServers;
+@property(nonatomic, strong) NSURL *webSocketURL;
+@property(nonatomic, strong) NSURL *webSocketRestURL;
+
+@property(nonatomic, strong)
+    RTCMediaConstraints *defaultPeerConnectionConstraints;
+
+- (instancetype)initWithRoomServerClient:(id<ARDRoomServerClient>)rsClient
+                        signalingChannel:(id<ARDSignalingChannel>)channel
+                              turnClient:(id<ARDTURNClient>)turnClient
+                                delegate:(id<ARDAppClientDelegate>)delegate;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDAppClient.h b/examples/objc/AppRTCDemo/ARDAppClient.h
new file mode 100644
index 0000000..04993e4
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDAppClient.h
@@ -0,0 +1,67 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+#import "RTCVideoTrack.h"
+
+typedef NS_ENUM(NSInteger, ARDAppClientState) {
+  // Disconnected from servers.
+  kARDAppClientStateDisconnected,
+  // Connecting to servers.
+  kARDAppClientStateConnecting,
+  // Connected to servers.
+  kARDAppClientStateConnected,
+};
+
+@class ARDAppClient;
+// The delegate is informed of pertinent events and will be called on the
+// main queue.
+@protocol ARDAppClientDelegate <NSObject>
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeState:(ARDAppClientState)state;
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeConnectionState:(RTCICEConnectionState)state;
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack;
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveRemoteVideoTrack:(RTCVideoTrack *)remoteVideoTrack;
+
+- (void)appClient:(ARDAppClient *)client
+         didError:(NSError *)error;
+
+@end
+
+// Handles connections to the AppRTC server for a given room. Methods on this
+// class should only be called from the main queue.
+@interface ARDAppClient : NSObject
+
+@property(nonatomic, readonly) ARDAppClientState state;
+@property(nonatomic, weak) id<ARDAppClientDelegate> delegate;
+
+// Convenience constructor since all expected use cases will need a delegate
+// in order to receive remote tracks.
+- (instancetype)initWithDelegate:(id<ARDAppClientDelegate>)delegate;
+
+// Establishes a connection with the AppRTC servers for the given room id.
+// TODO(tkchin): provide available keys/values for options. This will be used
+// for call configurations such as overriding server choice, specifying codecs
+// and so on.
+- (void)connectToRoomWithId:(NSString *)roomId
+                    options:(NSDictionary *)options;
+
+// Disconnects from the AppRTC servers and any connected clients.
+- (void)disconnect;
+
+@end
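
Editor's note: the header above only declares the public surface of ARDAppClient, so a short usage sketch may help readers who are not building the full demo. This is a minimal, hypothetical wiring based solely on the API and comments in ARDAppClient.h (initWithDelegate:, connectToRoomWithId:options:, disconnect, and the ARDAppClientDelegate callbacks). It is not part of this patch; the controller name and the rendering comments are illustrative placeholders.

// Hypothetical usage sketch; only the ARDAppClient.h API shown above is assumed.
#import <UIKit/UIKit.h>
#import "ARDAppClient.h"

@interface ARDExampleCallViewController : UIViewController <ARDAppClientDelegate>
@property(nonatomic, strong) ARDAppClient *client;
@end

@implementation ARDExampleCallViewController

- (void)joinRoom:(NSString *)roomId {
  // Per the class comment, only call ARDAppClient methods from the main queue.
  self.client = [[ARDAppClient alloc] initWithDelegate:self];
  [self.client connectToRoomWithId:roomId options:nil];
}

- (void)hangUp {
  [self.client disconnect];
}

#pragma mark - ARDAppClientDelegate
// All callbacks arrive on the main queue, per the protocol comment above.

- (void)appClient:(ARDAppClient *)client
    didChangeState:(ARDAppClientState)state {
  if (state == kARDAppClientStateDisconnected) {
    // Tear down the call UI here.
  }
}

- (void)appClient:(ARDAppClient *)client
    didChangeConnectionState:(RTCICEConnectionState)state {
  // Update any connection indicator here.
}

- (void)appClient:(ARDAppClient *)client
    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
  // Attach the local track to whatever video renderer view the app uses.
}

- (void)appClient:(ARDAppClient *)client
    didReceiveRemoteVideoTrack:(RTCVideoTrack *)remoteVideoTrack {
  // Attach the remote track to whatever video renderer view the app uses.
}

- (void)appClient:(ARDAppClient *)client
         didError:(NSError *)error {
  // The client disconnects itself on fatal errors; drop the reference.
  self.client = nil;
}

@end
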
diff --git a/examples/objc/AppRTCDemo/ARDAppClient.m b/examples/objc/AppRTCDemo/ARDAppClient.m
new file mode 100644
index 0000000..bcc7460
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDAppClient.m
@@ -0,0 +1,641 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDAppClient+Internal.h"
+
+#if defined(WEBRTC_IOS)
+#import "RTCAVFoundationVideoSource.h"
+#endif
+#import "RTCFileLogger.h"
+#import "RTCICEServer.h"
+#import "RTCLogging.h"
+#import "RTCMediaConstraints.h"
+#import "RTCMediaStream.h"
+#import "RTCPair.h"
+#import "RTCPeerConnectionInterface.h"
+#import "RTCVideoCapturer.h"
+
+#import "ARDAppEngineClient.h"
+#import "ARDCEODTURNClient.h"
+#import "ARDJoinResponse.h"
+#import "ARDMessageResponse.h"
+#import "ARDSDPUtils.h"
+#import "ARDSignalingMessage.h"
+#import "ARDUtilities.h"
+#import "ARDWebSocketChannel.h"
+#import "RTCICECandidate+JSON.h"
+#import "RTCSessionDescription+JSON.h"
+
+
+static NSString * const kARDDefaultSTUNServerUrl =
+    @"stun:stun.l.google.com:19302";
+// TODO(tkchin): figure out a better username for CEOD statistics.
+static NSString * const kARDTurnRequestUrl =
+    @"https://computeengineondemand.appspot.com"
+    @"/turn?username=iapprtc&key=4080218913";
+
+static NSString * const kARDAppClientErrorDomain = @"ARDAppClient";
+static NSInteger const kARDAppClientErrorUnknown = -1;
+static NSInteger const kARDAppClientErrorRoomFull = -2;
+static NSInteger const kARDAppClientErrorCreateSDP = -3;
+static NSInteger const kARDAppClientErrorSetSDP = -4;
+static NSInteger const kARDAppClientErrorInvalidClient = -5;
+static NSInteger const kARDAppClientErrorInvalidRoom = -6;
+
+@implementation ARDAppClient {
+  RTCFileLogger *_fileLogger;
+}
+
+@synthesize delegate = _delegate;
+@synthesize state = _state;
+@synthesize roomServerClient = _roomServerClient;
+@synthesize channel = _channel;
+@synthesize turnClient = _turnClient;
+@synthesize peerConnection = _peerConnection;
+@synthesize factory = _factory;
+@synthesize messageQueue = _messageQueue;
+@synthesize isTurnComplete = _isTurnComplete;
+@synthesize hasReceivedSdp = _hasReceivedSdp;
+@synthesize roomId = _roomId;
+@synthesize clientId = _clientId;
+@synthesize isInitiator = _isInitiator;
+@synthesize iceServers = _iceServers;
+@synthesize webSocketURL = _websocketURL;
+@synthesize webSocketRestURL = _websocketRestURL;
+@synthesize defaultPeerConnectionConstraints =
+    _defaultPeerConnectionConstraints;
+
+- (instancetype)init {
+  if (self = [super init]) {
+    _roomServerClient = [[ARDAppEngineClient alloc] init];
+    NSURL *turnRequestURL = [NSURL URLWithString:kARDTurnRequestUrl];
+    _turnClient = [[ARDCEODTURNClient alloc] initWithURL:turnRequestURL];
+    [self configure];
+  }
+  return self;
+}
+
+- (instancetype)initWithDelegate:(id<ARDAppClientDelegate>)delegate {
+  if (self = [super init]) {
+    _roomServerClient = [[ARDAppEngineClient alloc] init];
+    _delegate = delegate;
+    NSURL *turnRequestURL = [NSURL URLWithString:kARDTurnRequestUrl];
+    _turnClient = [[ARDCEODTURNClient alloc] initWithURL:turnRequestURL];
+    [self configure];
+  }
+  return self;
+}
+
+// TODO(tkchin): Provide signaling channel factory interface so we can recreate
+// channel if we need to on network failure. Also, make this the default public
+// constructor.
+- (instancetype)initWithRoomServerClient:(id<ARDRoomServerClient>)rsClient
+                        signalingChannel:(id<ARDSignalingChannel>)channel
+                              turnClient:(id<ARDTURNClient>)turnClient
+                                delegate:(id<ARDAppClientDelegate>)delegate {
+  NSParameterAssert(rsClient);
+  NSParameterAssert(channel);
+  NSParameterAssert(turnClient);
+  if (self = [super init]) {
+    _roomServerClient = rsClient;
+    _channel = channel;
+    _turnClient = turnClient;
+    _delegate = delegate;
+    [self configure];
+  }
+  return self;
+}
+
+- (void)configure {
+  _factory = [[RTCPeerConnectionFactory alloc] init];
+  _messageQueue = [NSMutableArray array];
+  _iceServers = [NSMutableArray arrayWithObject:[self defaultSTUNServer]];
+  _fileLogger = [[RTCFileLogger alloc] init];
+  [_fileLogger start];
+}
+
+- (void)dealloc {
+  [self disconnect];
+}
+
+- (void)setState:(ARDAppClientState)state {
+  if (_state == state) {
+    return;
+  }
+  _state = state;
+  [_delegate appClient:self didChangeState:_state];
+}
+
+- (void)connectToRoomWithId:(NSString *)roomId
+                    options:(NSDictionary *)options {
+  NSParameterAssert(roomId.length);
+  NSParameterAssert(_state == kARDAppClientStateDisconnected);
+  self.state = kARDAppClientStateConnecting;
+
+  // Request TURN.
+  __weak ARDAppClient *weakSelf = self;
+  [_turnClient requestServersWithCompletionHandler:^(NSArray *turnServers,
+                                                     NSError *error) {
+    if (error) {
+      RTCLogError("Error retrieving TURN servers: %@",
+                  error.localizedDescription);
+    }
+    ARDAppClient *strongSelf = weakSelf;
+    [strongSelf.iceServers addObjectsFromArray:turnServers];
+    strongSelf.isTurnComplete = YES;
+    [strongSelf startSignalingIfReady];
+  }];
+
+  // Join room on room server.
+  [_roomServerClient joinRoomWithRoomId:roomId
+      completionHandler:^(ARDJoinResponse *response, NSError *error) {
+    ARDAppClient *strongSelf = weakSelf;
+    if (error) {
+      [strongSelf.delegate appClient:strongSelf didError:error];
+      return;
+    }
+    NSError *joinError =
+        [[strongSelf class] errorForJoinResultType:response.result];
+    if (joinError) {
+      RTCLogError(@"Failed to join room:%@ on room server.", roomId);
+      [strongSelf disconnect];
+      [strongSelf.delegate appClient:strongSelf didError:joinError];
+      return;
+    }
+    RTCLog(@"Joined room:%@ on room server.", roomId);
+    strongSelf.roomId = response.roomId;
+    strongSelf.clientId = response.clientId;
+    strongSelf.isInitiator = response.isInitiator;
+    for (ARDSignalingMessage *message in response.messages) {
+      if (message.type == kARDSignalingMessageTypeOffer ||
+          message.type == kARDSignalingMessageTypeAnswer) {
+        strongSelf.hasReceivedSdp = YES;
+        [strongSelf.messageQueue insertObject:message atIndex:0];
+      } else {
+        [strongSelf.messageQueue addObject:message];
+      }
+    }
+    strongSelf.webSocketURL = response.webSocketURL;
+    strongSelf.webSocketRestURL = response.webSocketRestURL;
+    [strongSelf registerWithColliderIfReady];
+    [strongSelf startSignalingIfReady];
+  }];
+}
+
+- (void)disconnect {
+  if (_state == kARDAppClientStateDisconnected) {
+    return;
+  }
+  if (self.hasJoinedRoomServerRoom) {
+    [_roomServerClient leaveRoomWithRoomId:_roomId
+                                  clientId:_clientId
+                         completionHandler:nil];
+  }
+  if (_channel) {
+    if (_channel.state == kARDSignalingChannelStateRegistered) {
+      // Tell the other client we're hanging up.
+      ARDByeMessage *byeMessage = [[ARDByeMessage alloc] init];
+      [_channel sendMessage:byeMessage];
+    }
+    // Disconnect from collider.
+    _channel = nil;
+  }
+  _clientId = nil;
+  _roomId = nil;
+  _isInitiator = NO;
+  _hasReceivedSdp = NO;
+  _messageQueue = [NSMutableArray array];
+  _peerConnection = nil;
+  self.state = kARDAppClientStateDisconnected;
+}
+
+#pragma mark - ARDSignalingChannelDelegate
+
+- (void)channel:(id<ARDSignalingChannel>)channel
+    didReceiveMessage:(ARDSignalingMessage *)message {
+  switch (message.type) {
+    case kARDSignalingMessageTypeOffer:
+    case kARDSignalingMessageTypeAnswer:
+      // Offers and answers must be processed before any other message, so we
+      // place them at the front of the queue.
+      _hasReceivedSdp = YES;
+      [_messageQueue insertObject:message atIndex:0];
+      break;
+    case kARDSignalingMessageTypeCandidate:
+      [_messageQueue addObject:message];
+      break;
+    case kARDSignalingMessageTypeBye:
+      // Disconnects can be processed immediately.
+      [self processSignalingMessage:message];
+      return;
+  }
+  [self drainMessageQueueIfReady];
+}
+
+- (void)channel:(id<ARDSignalingChannel>)channel
+    didChangeState:(ARDSignalingChannelState)state {
+  switch (state) {
+    case kARDSignalingChannelStateOpen:
+      break;
+    case kARDSignalingChannelStateRegistered:
+      break;
+    case kARDSignalingChannelStateClosed:
+    case kARDSignalingChannelStateError:
+      // TODO(tkchin): reconnection scenarios. Right now we just disconnect
+      // completely if the websocket connection fails.
+      [self disconnect];
+      break;
+  }
+}
+
+#pragma mark - RTCPeerConnectionDelegate
+// Callbacks for this delegate occur on non-main thread and need to be
+// dispatched back to main queue as needed.
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+    signalingStateChanged:(RTCSignalingState)stateChanged {
+  RTCLog(@"Signaling state changed: %d", stateChanged);
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+           addedStream:(RTCMediaStream *)stream {
+  dispatch_async(dispatch_get_main_queue(), ^{
+    RTCLog(@"Received %lu video tracks and %lu audio tracks",
+        (unsigned long)stream.videoTracks.count,
+        (unsigned long)stream.audioTracks.count);
+    if (stream.videoTracks.count) {
+      RTCVideoTrack *videoTrack = stream.videoTracks[0];
+      [_delegate appClient:self didReceiveRemoteVideoTrack:videoTrack];
+    }
+  });
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+        removedStream:(RTCMediaStream *)stream {
+  RTCLog(@"Stream was removed.");
+}
+
+- (void)peerConnectionOnRenegotiationNeeded:
+    (RTCPeerConnection *)peerConnection {
+  RTCLog(@"WARNING: Renegotiation needed but unimplemented.");
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+    iceConnectionChanged:(RTCICEConnectionState)newState {
+  RTCLog(@"ICE state changed: %d", newState);
+  dispatch_async(dispatch_get_main_queue(), ^{
+    [_delegate appClient:self didChangeConnectionState:newState];
+  });
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+    iceGatheringChanged:(RTCICEGatheringState)newState {
+  RTCLog(@"ICE gathering state changed: %d", newState);
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+       gotICECandidate:(RTCICECandidate *)candidate {
+  dispatch_async(dispatch_get_main_queue(), ^{
+    ARDICECandidateMessage *message =
+        [[ARDICECandidateMessage alloc] initWithCandidate:candidate];
+    [self sendSignalingMessage:message];
+  });
+}
+
+- (void)peerConnection:(RTCPeerConnection*)peerConnection
+    didOpenDataChannel:(RTCDataChannel*)dataChannel {
+}
+
+#pragma mark - RTCSessionDescriptionDelegate
+// Callbacks for this delegate occur on non-main thread and need to be
+// dispatched back to main queue as needed.
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+    didCreateSessionDescription:(RTCSessionDescription *)sdp
+                          error:(NSError *)error {
+  dispatch_async(dispatch_get_main_queue(), ^{
+    if (error) {
+      RTCLogError(@"Failed to create session description. Error: %@", error);
+      [self disconnect];
+      NSDictionary *userInfo = @{
+        NSLocalizedDescriptionKey: @"Failed to create session description.",
+      };
+      NSError *sdpError =
+          [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                     code:kARDAppClientErrorCreateSDP
+                                 userInfo:userInfo];
+      [_delegate appClient:self didError:sdpError];
+      return;
+    }
+    // Prefer H264 if available.
+    RTCSessionDescription *sdpPreferringH264 =
+        [ARDSDPUtils descriptionForDescription:sdp
+                           preferredVideoCodec:@"H264"];
+    [_peerConnection setLocalDescriptionWithDelegate:self
+                                  sessionDescription:sdpPreferringH264];
+    ARDSessionDescriptionMessage *message =
+        [[ARDSessionDescriptionMessage alloc]
+            initWithDescription:sdpPreferringH264];
+    [self sendSignalingMessage:message];
+  });
+}
+
+- (void)peerConnection:(RTCPeerConnection *)peerConnection
+    didSetSessionDescriptionWithError:(NSError *)error {
+  dispatch_async(dispatch_get_main_queue(), ^{
+    if (error) {
+      RTCLogError(@"Failed to set session description. Error: %@", error);
+      [self disconnect];
+      NSDictionary *userInfo = @{
+        NSLocalizedDescriptionKey: @"Failed to set session description.",
+      };
+      NSError *sdpError =
+          [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                     code:kARDAppClientErrorSetSDP
+                                 userInfo:userInfo];
+      [_delegate appClient:self didError:sdpError];
+      return;
+    }
+    // If we're answering and we've just set the remote offer we need to create
+    // an answer and set the local description.
+    if (!_isInitiator && !_peerConnection.localDescription) {
+      RTCMediaConstraints *constraints = [self defaultAnswerConstraints];
+      [_peerConnection createAnswerWithDelegate:self
+                                    constraints:constraints];
+
+    }
+  });
+}
+
+#pragma mark - Private
+
+- (BOOL)hasJoinedRoomServerRoom {
+  return _clientId.length;
+}
+
+// Begins establishing the peer connection if we have both joined a room
+// on the room server and tried to obtain a TURN server. Otherwise does nothing.
+// A peer connection object will be created with a stream that contains local
+// audio and video capture. If this client is the caller, an offer is created as
+// well, otherwise the client will wait for an offer to arrive.
+- (void)startSignalingIfReady {
+  if (!_isTurnComplete || !self.hasJoinedRoomServerRoom) {
+    return;
+  }
+  self.state = kARDAppClientStateConnected;
+
+  // Create peer connection.
+  RTCMediaConstraints *constraints = [self defaultPeerConnectionConstraints];
+  RTCConfiguration *config = [[RTCConfiguration alloc] init];
+  config.iceServers = _iceServers;
+  _peerConnection = [_factory peerConnectionWithConfiguration:config
+                                                  constraints:constraints
+                                                     delegate:self];
+  // Create AV media stream and add it to the peer connection.
+  RTCMediaStream *localStream = [self createLocalMediaStream];
+  [_peerConnection addStream:localStream];
+  if (_isInitiator) {
+    // Send offer.
+    [_peerConnection createOfferWithDelegate:self
+                                 constraints:[self defaultOfferConstraints]];
+  } else {
+    // Check if we've received an offer.
+    [self drainMessageQueueIfReady];
+  }
+}
+
+// Processes the messages that we've received from the room server and the
+// signaling channel. The offer or answer message must be processed before other
+// signaling messages; however, they can arrive out of order. Hence, this method
+// only processes pending messages if there is a peer connection object and
+// if we have received either an offer or answer.
+- (void)drainMessageQueueIfReady {
+  if (!_peerConnection || !_hasReceivedSdp) {
+    return;
+  }
+  for (ARDSignalingMessage *message in _messageQueue) {
+    [self processSignalingMessage:message];
+  }
+  [_messageQueue removeAllObjects];
+}
+
+// Processes the given signaling message based on its type.
+- (void)processSignalingMessage:(ARDSignalingMessage *)message {
+  NSParameterAssert(_peerConnection ||
+      message.type == kARDSignalingMessageTypeBye);
+  switch (message.type) {
+    case kARDSignalingMessageTypeOffer:
+    case kARDSignalingMessageTypeAnswer: {
+      ARDSessionDescriptionMessage *sdpMessage =
+          (ARDSessionDescriptionMessage *)message;
+      RTCSessionDescription *description = sdpMessage.sessionDescription;
+      // Prefer H264 if available.
+      RTCSessionDescription *sdpPreferringH264 =
+          [ARDSDPUtils descriptionForDescription:description
+                             preferredVideoCodec:@"H264"];
+      [_peerConnection setRemoteDescriptionWithDelegate:self
+                                     sessionDescription:sdpPreferringH264];
+      break;
+    }
+    case kARDSignalingMessageTypeCandidate: {
+      ARDICECandidateMessage *candidateMessage =
+          (ARDICECandidateMessage *)message;
+      [_peerConnection addICECandidate:candidateMessage.candidate];
+      break;
+    }
+    case kARDSignalingMessageTypeBye:
+      // Other client disconnected.
+      // TODO(tkchin): support waiting in room for next client. For now just
+      // disconnect.
+      [self disconnect];
+      break;
+  }
+}
+
+// Sends a signaling message to the other client. The caller will send messages
+// through the room server, whereas the callee will send messages over the
+// signaling channel.
+- (void)sendSignalingMessage:(ARDSignalingMessage *)message {
+  if (_isInitiator) {
+    __weak ARDAppClient *weakSelf = self;
+    [_roomServerClient sendMessage:message
+                         forRoomId:_roomId
+                          clientId:_clientId
+                 completionHandler:^(ARDMessageResponse *response,
+                                     NSError *error) {
+      ARDAppClient *strongSelf = weakSelf;
+      if (error) {
+        [strongSelf.delegate appClient:strongSelf didError:error];
+        return;
+      }
+      NSError *messageError =
+          [[strongSelf class] errorForMessageResultType:response.result];
+      if (messageError) {
+        [strongSelf.delegate appClient:strongSelf didError:messageError];
+        return;
+      }
+    }];
+  } else {
+    [_channel sendMessage:message];
+  }
+}
+
+- (RTCMediaStream *)createLocalMediaStream {
+  RTCMediaStream* localStream = [_factory mediaStreamWithLabel:@"ARDAMS"];
+  RTCVideoTrack* localVideoTrack = [self createLocalVideoTrack];
+  if (localVideoTrack) {
+    [localStream addVideoTrack:localVideoTrack];
+    [_delegate appClient:self didReceiveLocalVideoTrack:localVideoTrack];
+  }
+  [localStream addAudioTrack:[_factory audioTrackWithID:@"ARDAMSa0"]];
+  return localStream;
+}
+
+- (RTCVideoTrack *)createLocalVideoTrack {
+  RTCVideoTrack* localVideoTrack = nil;
+  // The iOS simulator doesn't provide any sort of camera capture
+  // support or emulation (http://goo.gl/rHAnC1) so don't bother
+  // trying to open a local stream.
+  // TODO(tkchin): local video capture for OSX. See
+  // https://code.google.com/p/webrtc/issues/detail?id=3417.
+#if !TARGET_IPHONE_SIMULATOR && TARGET_OS_IPHONE
+  RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
+  RTCAVFoundationVideoSource *source =
+      [[RTCAVFoundationVideoSource alloc] initWithFactory:_factory
+                                              constraints:mediaConstraints];
+  localVideoTrack =
+      [[RTCVideoTrack alloc] initWithFactory:_factory
+                                      source:source
+                                     trackId:@"ARDAMSv0"];
+#endif
+  return localVideoTrack;
+}
+
+#pragma mark - Collider methods
+
+- (void)registerWithColliderIfReady {
+  if (!self.hasJoinedRoomServerRoom) {
+    return;
+  }
+  // Open WebSocket connection.
+  if (!_channel) {
+    _channel =
+        [[ARDWebSocketChannel alloc] initWithURL:_websocketURL
+                                         restURL:_websocketRestURL
+                                        delegate:self];
+  }
+  [_channel registerForRoomId:_roomId clientId:_clientId];
+}
+
+#pragma mark - Defaults
+
+- (RTCMediaConstraints *)defaultMediaStreamConstraints {
+  RTCMediaConstraints* constraints =
+      [[RTCMediaConstraints alloc]
+          initWithMandatoryConstraints:nil
+                   optionalConstraints:nil];
+  return constraints;
+}
+
+- (RTCMediaConstraints *)defaultAnswerConstraints {
+  return [self defaultOfferConstraints];
+}
+
+- (RTCMediaConstraints *)defaultOfferConstraints {
+  NSArray *mandatoryConstraints = @[
+      [[RTCPair alloc] initWithKey:@"OfferToReceiveAudio" value:@"true"],
+      [[RTCPair alloc] initWithKey:@"OfferToReceiveVideo" value:@"true"]
+  ];
+  RTCMediaConstraints* constraints =
+      [[RTCMediaConstraints alloc]
+          initWithMandatoryConstraints:mandatoryConstraints
+                   optionalConstraints:nil];
+  return constraints;
+}
+
+- (RTCMediaConstraints *)defaultPeerConnectionConstraints {
+  if (_defaultPeerConnectionConstraints) {
+    return _defaultPeerConnectionConstraints;
+  }
+  NSArray *optionalConstraints = @[
+      [[RTCPair alloc] initWithKey:@"DtlsSrtpKeyAgreement" value:@"true"]
+  ];
+  RTCMediaConstraints* constraints =
+      [[RTCMediaConstraints alloc]
+          initWithMandatoryConstraints:nil
+                   optionalConstraints:optionalConstraints];
+  return constraints;
+}
+
+- (RTCICEServer *)defaultSTUNServer {
+  NSURL *defaultSTUNServerURL = [NSURL URLWithString:kARDDefaultSTUNServerUrl];
+  return [[RTCICEServer alloc] initWithURI:defaultSTUNServerURL
+                                  username:@""
+                                  password:@""];
+}
+
+#pragma mark - Errors
+
++ (NSError *)errorForJoinResultType:(ARDJoinResultType)resultType {
+  NSError *error = nil;
+  switch (resultType) {
+    case kARDJoinResultTypeSuccess:
+      break;
+    case kARDJoinResultTypeUnknown: {
+      error = [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                         code:kARDAppClientErrorUnknown
+                                     userInfo:@{
+        NSLocalizedDescriptionKey: @"Unknown error.",
+      }];
+      break;
+    }
+    case kARDJoinResultTypeFull: {
+      error = [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                         code:kARDAppClientErrorRoomFull
+                                     userInfo:@{
+        NSLocalizedDescriptionKey: @"Room is full.",
+      }];
+      break;
+    }
+  }
+  return error;
+}
+
++ (NSError *)errorForMessageResultType:(ARDMessageResultType)resultType {
+  NSError *error = nil;
+  switch (resultType) {
+    case kARDMessageResultTypeSuccess:
+      break;
+    case kARDMessageResultTypeUnknown:
+      error = [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                         code:kARDAppClientErrorUnknown
+                                     userInfo:@{
+        NSLocalizedDescriptionKey: @"Unknown error.",
+      }];
+      break;
+    case kARDMessageResultTypeInvalidClient:
+      error = [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                         code:kARDAppClientErrorInvalidClient
+                                     userInfo:@{
+        NSLocalizedDescriptionKey: @"Invalid client.",
+      }];
+      break;
+    case kARDMessageResultTypeInvalidRoom:
+      error = [[NSError alloc] initWithDomain:kARDAppClientErrorDomain
+                                         code:kARDAppClientErrorInvalidRoom
+                                     userInfo:@{
+        NSLocalizedDescriptionKey: @"Invalid room.",
+      }];
+      break;
+  }
+  return error;
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDAppEngineClient.h b/examples/objc/AppRTCDemo/ARDAppEngineClient.h
new file mode 100644
index 0000000..7514f36
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDAppEngineClient.h
@@ -0,0 +1,14 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDRoomServerClient.h"
+
+@interface ARDAppEngineClient : NSObject <ARDRoomServerClient>
+@end
diff --git a/examples/objc/AppRTCDemo/ARDAppEngineClient.m b/examples/objc/AppRTCDemo/ARDAppEngineClient.m
new file mode 100644
index 0000000..4318e6b
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDAppEngineClient.m
@@ -0,0 +1,165 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDAppEngineClient.h"
+
+#import "RTCLogging.h"
+
+#import "ARDJoinResponse.h"
+#import "ARDMessageResponse.h"
+#import "ARDSignalingMessage.h"
+#import "ARDUtilities.h"
+
+// TODO(tkchin): move these to a configuration object.
+static NSString * const kARDRoomServerHostUrl =
+    @"https://apprtc.appspot.com";
+static NSString * const kARDRoomServerJoinFormat =
+    @"https://apprtc.appspot.com/join/%@";
+static NSString * const kARDRoomServerMessageFormat =
+    @"https://apprtc.appspot.com/message/%@/%@";
+static NSString * const kARDRoomServerLeaveFormat =
+    @"https://apprtc.appspot.com/leave/%@/%@";
+
+static NSString * const kARDAppEngineClientErrorDomain = @"ARDAppEngineClient";
+static NSInteger const kARDAppEngineClientErrorBadResponse = -1;
+
+@implementation ARDAppEngineClient
+
+#pragma mark - ARDRoomServerClient
+
+- (void)joinRoomWithRoomId:(NSString *)roomId
+         completionHandler:(void (^)(ARDJoinResponse *response,
+                                     NSError *error))completionHandler {
+  NSParameterAssert(roomId.length);
+
+  NSString *urlString =
+      [NSString stringWithFormat:kARDRoomServerJoinFormat, roomId];
+  NSURL *roomURL = [NSURL URLWithString:urlString];
+  RTCLog(@"Joining room:%@ on room server.", roomId);
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:roomURL];
+  request.HTTPMethod = @"POST";
+  __weak ARDAppEngineClient *weakSelf = self;
+  [NSURLConnection sendAsyncRequest:request
+                  completionHandler:^(NSURLResponse *response,
+                                      NSData *data,
+                                      NSError *error) {
+    ARDAppEngineClient *strongSelf = weakSelf;
+    if (error) {
+      if (completionHandler) {
+        completionHandler(nil, error);
+      }
+      return;
+    }
+    ARDJoinResponse *joinResponse =
+        [ARDJoinResponse responseFromJSONData:data];
+    if (!joinResponse) {
+      if (completionHandler) {
+        NSError *error = [[self class] badResponseError];
+        completionHandler(nil, error);
+      }
+      return;
+    }
+    if (completionHandler) {
+      completionHandler(joinResponse, nil);
+    }
+  }];
+}
+
+- (void)sendMessage:(ARDSignalingMessage *)message
+            forRoomId:(NSString *)roomId
+             clientId:(NSString *)clientId
+    completionHandler:(void (^)(ARDMessageResponse *response,
+                                NSError *error))completionHandler {
+  NSParameterAssert(message);
+  NSParameterAssert(roomId.length);
+  NSParameterAssert(clientId.length);
+
+  NSData *data = [message JSONData];
+  NSString *urlString =
+      [NSString stringWithFormat:
+          kARDRoomServerMessageFormat, roomId, clientId];
+  NSURL *url = [NSURL URLWithString:urlString];
+  RTCLog(@"C->RS POST: %@", message);
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
+  request.HTTPMethod = @"POST";
+  request.HTTPBody = data;
+  __weak ARDAppEngineClient *weakSelf = self;
+  [NSURLConnection sendAsyncRequest:request
+                  completionHandler:^(NSURLResponse *response,
+                                      NSData *data,
+                                      NSError *error) {
+    ARDAppEngineClient *strongSelf = weakSelf;
+    if (error) {
+      if (completionHandler) {
+        completionHandler(nil, error);
+      }
+      return;
+    }
+    ARDMessageResponse *messageResponse =
+        [ARDMessageResponse responseFromJSONData:data];
+    if (!messageResponse) {
+      if (completionHandler) {
+        NSError *error = [[self class] badResponseError];
+        completionHandler(nil, error);
+      }
+      return;
+    }
+    if (completionHandler) {
+      completionHandler(messageResponse, nil);
+    }
+  }];
+}
+
+- (void)leaveRoomWithRoomId:(NSString *)roomId
+                   clientId:(NSString *)clientId
+          completionHandler:(void (^)(NSError *error))completionHandler {
+  NSParameterAssert(roomId.length);
+  NSParameterAssert(clientId.length);
+
+  NSString *urlString =
+      [NSString stringWithFormat:kARDRoomServerLeaveFormat, roomId, clientId];
+  NSURL *url = [NSURL URLWithString:urlString];
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
+  request.HTTPMethod = @"POST";
+  NSURLResponse *response = nil;
+  NSError *error = nil;
+  // We want a synchronous request so that we know we have left the room on
+  // the room server before doing any further work.
+  RTCLog(@"C->RS: BYE");
+  [NSURLConnection sendSynchronousRequest:request
+                        returningResponse:&response
+                                    error:&error];
+  if (error) {
+    RTCLogError(@"Error leaving room %@ on room server: %@",
+          roomId, error.localizedDescription);
+    if (completionHandler) {
+      completionHandler(error);
+    }
+    return;
+  }
+  RTCLog(@"Left room:%@ on room server.", roomId);
+  if (completionHandler) {
+    completionHandler(nil);
+  }
+}
+
+#pragma mark - Private
+
++ (NSError *)badResponseError {
+  NSError *error =
+      [[NSError alloc] initWithDomain:kARDAppEngineClientErrorDomain
+                                 code:kARDAppEngineClientErrorBadResponse
+                             userInfo:@{
+    NSLocalizedDescriptionKey: @"Error parsing response.",
+  }];
+  return error;
+}
+
+@end
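
For orientation, here is a minimal usage sketch of the room-server client defined above. It assumes a reachable AppRTC room server; the room id is a placeholder, and a real caller would keep a strong reference to the client and handle errors.

  ARDAppEngineClient *roomClient = [[ARDAppEngineClient alloc] init];
  [roomClient joinRoomWithRoomId:@"example-room"  // placeholder room id
               completionHandler:^(ARDJoinResponse *response, NSError *error) {
    if (error || response.result != kARDJoinResultTypeSuccess) {
      return;
    }
    // Once joined, signaling messages are POSTed to /message/<room>/<client>.
    [roomClient sendMessage:[[ARDByeMessage alloc] init]
                  forRoomId:response.roomId
                   clientId:response.clientId
          completionHandler:^(ARDMessageResponse *messageResponse,
                              NSError *messageError) {
      // Inspect messageResponse.result to detect INVALID_ROOM / INVALID_CLIENT.
    }];
  }];
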
diff --git a/examples/objc/AppRTCDemo/ARDCEODTURNClient.h b/examples/objc/AppRTCDemo/ARDCEODTURNClient.h
new file mode 100644
index 0000000..9b136aa
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDCEODTURNClient.h
@@ -0,0 +1,18 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDTURNClient.h"
+
+// Requests TURN server URLs on demand from the compute engine (CEOD) service.
+@interface ARDCEODTURNClient : NSObject <ARDTURNClient>
+
+- (instancetype)initWithURL:(NSURL *)url;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDCEODTURNClient.m b/examples/objc/AppRTCDemo/ARDCEODTURNClient.m
new file mode 100644
index 0000000..70f815a
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDCEODTURNClient.m
@@ -0,0 +1,66 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDCEODTURNClient.h"
+
+#import "ARDUtilities.h"
+#import "RTCICEServer+JSON.h"
+
+// TODO(tkchin): move this to a configuration object.
+static NSString *kTURNOriginURLString = @"https://apprtc.appspot.com";
+static NSString *kARDCEODTURNClientErrorDomain = @"ARDCEODTURNClient";
+static NSInteger kARDCEODTURNClientErrorBadResponse = -1;
+
+@implementation ARDCEODTURNClient {
+  NSURL *_url;
+}
+
+- (instancetype)initWithURL:(NSURL *)url {
+  NSParameterAssert([url absoluteString].length);
+  if (self = [super init]) {
+    _url = url;
+  }
+  return self;
+}
+
+- (void)requestServersWithCompletionHandler:
+    (void (^)(NSArray *turnServers,
+              NSError *error))completionHandler {
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:_url];
+  // We need to set the origin header because the TURN provider whitelists
+  // requests based on origin.
+  [request addValue:@"Mozilla/5.0" forHTTPHeaderField:@"user-agent"];
+  [request addValue:kTURNOriginURLString forHTTPHeaderField:@"origin"];
+  [NSURLConnection sendAsyncRequest:request
+                  completionHandler:^(NSURLResponse *response,
+                                      NSData *data,
+                                      NSError *error) {
+    NSArray *turnServers = [NSArray array];
+    if (error) {
+      completionHandler(turnServers, error);
+      return;
+    }
+    NSDictionary *dict = [NSDictionary dictionaryWithJSONData:data];
+    turnServers = [RTCICEServer serversFromCEODJSONDictionary:dict];
+    if (!turnServers) {
+      NSError *responseError =
+          [[NSError alloc] initWithDomain:kARDCEODTURNClientErrorDomain
+                                     code:kARDCEODTURNClientErrorBadResponse
+                                 userInfo:@{
+            NSLocalizedDescriptionKey: @"Bad TURN response.",
+          }];
+      completionHandler(turnServers, responseError);
+      return;
+    }
+    completionHandler(turnServers, nil);
+  }];
+}
+
+@end
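
A similarly minimal sketch of fetching TURN servers with the client above; the request URL is a placeholder for the CEOD endpoint the app is configured with.

  NSURL *turnRequestURL =
      [NSURL URLWithString:@"https://example.com/turn"];  // placeholder endpoint
  ARDCEODTURNClient *turnClient =
      [[ARDCEODTURNClient alloc] initWithURL:turnRequestURL];
  [turnClient requestServersWithCompletionHandler:^(NSArray *turnServers,
                                                    NSError *error) {
    // On success, turnServers holds RTCICEServer objects parsed from the CEOD
    // JSON; on failure it is empty and error describes what went wrong.
  }];
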
diff --git a/examples/objc/AppRTCDemo/ARDJoinResponse+Internal.h b/examples/objc/AppRTCDemo/ARDJoinResponse+Internal.h
new file mode 100644
index 0000000..b320299
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDJoinResponse+Internal.h
@@ -0,0 +1,23 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDJoinResponse.h"
+
+@interface ARDJoinResponse ()
+
+@property(nonatomic, assign) ARDJoinResultType result;
+@property(nonatomic, assign) BOOL isInitiator;
+@property(nonatomic, strong) NSString *roomId;
+@property(nonatomic, strong) NSString *clientId;
+@property(nonatomic, strong) NSArray *messages;
+@property(nonatomic, strong) NSURL *webSocketURL;
+@property(nonatomic, strong) NSURL *webSocketRestURL;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDJoinResponse.h b/examples/objc/AppRTCDemo/ARDJoinResponse.h
new file mode 100644
index 0000000..2911202
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDJoinResponse.h
@@ -0,0 +1,32 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+typedef NS_ENUM(NSInteger, ARDJoinResultType) {
+  kARDJoinResultTypeUnknown,
+  kARDJoinResultTypeSuccess,
+  kARDJoinResultTypeFull
+};
+
+// Result of joining a room on the room server.
+@interface ARDJoinResponse : NSObject
+
+@property(nonatomic, readonly) ARDJoinResultType result;
+@property(nonatomic, readonly) BOOL isInitiator;
+@property(nonatomic, readonly) NSString *roomId;
+@property(nonatomic, readonly) NSString *clientId;
+@property(nonatomic, readonly) NSArray *messages;
+@property(nonatomic, readonly) NSURL *webSocketURL;
+@property(nonatomic, readonly) NSURL *webSocketRestURL;
+
++ (ARDJoinResponse *)responseFromJSONData:(NSData *)data;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDJoinResponse.m b/examples/objc/AppRTCDemo/ARDJoinResponse.m
new file mode 100644
index 0000000..b6c2be9
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDJoinResponse.m
@@ -0,0 +1,82 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDJoinResponse+Internal.h"
+
+#import "ARDSignalingMessage.h"
+#import "ARDUtilities.h"
+#import "RTCICEServer+JSON.h"
+
+static NSString const *kARDJoinResultKey = @"result";
+static NSString const *kARDJoinResultParamsKey = @"params";
+static NSString const *kARDJoinInitiatorKey = @"is_initiator";
+static NSString const *kARDJoinRoomIdKey = @"room_id";
+static NSString const *kARDJoinClientIdKey = @"client_id";
+static NSString const *kARDJoinMessagesKey = @"messages";
+static NSString const *kARDJoinWebSocketURLKey = @"wss_url";
+static NSString const *kARDJoinWebSocketRestURLKey = @"wss_post_url";
+
+@implementation ARDJoinResponse
+
+@synthesize result = _result;
+@synthesize isInitiator = _isInitiator;
+@synthesize roomId = _roomId;
+@synthesize clientId = _clientId;
+@synthesize messages = _messages;
+@synthesize webSocketURL = _webSocketURL;
+@synthesize webSocketRestURL = _webSocketRestURL;
+
++ (ARDJoinResponse *)responseFromJSONData:(NSData *)data {
+  NSDictionary *responseJSON = [NSDictionary dictionaryWithJSONData:data];
+  if (!responseJSON) {
+    return nil;
+  }
+  ARDJoinResponse *response = [[ARDJoinResponse alloc] init];
+  NSString *resultString = responseJSON[kARDJoinResultKey];
+  response.result = [[self class] resultTypeFromString:resultString];
+  NSDictionary *params = responseJSON[kARDJoinResultParamsKey];
+
+  response.isInitiator = [params[kARDJoinInitiatorKey] boolValue];
+  response.roomId = params[kARDJoinRoomIdKey];
+  response.clientId = params[kARDJoinClientIdKey];
+
+  // Parse messages.
+  NSArray *messages = params[kARDJoinMessagesKey];
+  NSMutableArray *signalingMessages =
+      [NSMutableArray arrayWithCapacity:messages.count];
+  for (NSString *message in messages) {
+    ARDSignalingMessage *signalingMessage =
+        [ARDSignalingMessage messageFromJSONString:message];
+    [signalingMessages addObject:signalingMessage];
+  }
+  response.messages = signalingMessages;
+
+  // Parse websocket urls.
+  NSString *webSocketURLString = params[kARDJoinWebSocketURLKey];
+  response.webSocketURL = [NSURL URLWithString:webSocketURLString];
+  NSString *webSocketRestURLString = params[kARDJoinWebSocketRestURLKey];
+  response.webSocketRestURL = [NSURL URLWithString:webSocketRestURLString];
+
+  return response;
+}
+
+#pragma mark - Private
+
++ (ARDJoinResultType)resultTypeFromString:(NSString *)resultString {
+  ARDJoinResultType result = kARDJoinResultTypeUnknown;
+  if ([resultString isEqualToString:@"SUCCESS"]) {
+    result = kARDJoinResultTypeSuccess;
+  } else if ([resultString isEqualToString:@"FULL"]) {
+    result = kARDJoinResultTypeFull;
+  }
+  return result;
+}
+
+@end
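
To make the expected wire format concrete, the following is a hedged example of the join-response JSON that responseFromJSONData: parses; every value is a placeholder.

  NSDictionary *exampleJoinJSON = @{
    @"result" : @"SUCCESS",
    @"params" : @{
      @"is_initiator" : @"true",
      @"room_id" : @"example-room",
      @"client_id" : @"12345",
      @"messages" : @[],
      @"wss_url" : @"wss://example.com/ws",
      @"wss_post_url" : @"https://example.com",
    },
  };
  NSData *data = [NSJSONSerialization dataWithJSONObject:exampleJoinJSON
                                                 options:0
                                                   error:nil];
  ARDJoinResponse *response = [ARDJoinResponse responseFromJSONData:data];
  // Yields result == kARDJoinResultTypeSuccess, roomId == @"example-room",
  // clientId == @"12345", and the two WebSocket URLs.
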
diff --git a/examples/objc/AppRTCDemo/ARDMessageResponse+Internal.h b/examples/objc/AppRTCDemo/ARDMessageResponse+Internal.h
new file mode 100644
index 0000000..66ee761
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDMessageResponse+Internal.h
@@ -0,0 +1,17 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDMessageResponse.h"
+
+@interface ARDMessageResponse ()
+
+@property(nonatomic, assign) ARDMessageResultType result;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDMessageResponse.h b/examples/objc/AppRTCDemo/ARDMessageResponse.h
new file mode 100644
index 0000000..65468cd
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDMessageResponse.h
@@ -0,0 +1,26 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+typedef NS_ENUM(NSInteger, ARDMessageResultType) {
+  kARDMessageResultTypeUnknown,
+  kARDMessageResultTypeSuccess,
+  kARDMessageResultTypeInvalidRoom,
+  kARDMessageResultTypeInvalidClient
+};
+
+@interface ARDMessageResponse : NSObject
+
+@property(nonatomic, readonly) ARDMessageResultType result;
+
++ (ARDMessageResponse *)responseFromJSONData:(NSData *)data;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDMessageResponse.m b/examples/objc/AppRTCDemo/ARDMessageResponse.m
new file mode 100644
index 0000000..0f5383f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDMessageResponse.m
@@ -0,0 +1,46 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDMessageResponse+Internal.h"
+
+#import "ARDUtilities.h"
+
+static NSString const *kARDMessageResultKey = @"result";
+
+@implementation ARDMessageResponse
+
+@synthesize result = _result;
+
++ (ARDMessageResponse *)responseFromJSONData:(NSData *)data {
+  NSDictionary *responseJSON = [NSDictionary dictionaryWithJSONData:data];
+  if (!responseJSON) {
+    return nil;
+  }
+  ARDMessageResponse *response = [[ARDMessageResponse alloc] init];
+  response.result =
+      [[self class] resultTypeFromString:responseJSON[kARDMessageResultKey]];
+  return response;
+}
+
+#pragma mark - Private
+
++ (ARDMessageResultType)resultTypeFromString:(NSString *)resultString {
+  ARDMessageResultType result = kARDMessageResultTypeUnknown;
+  if ([resultString isEqualToString:@"SUCCESS"]) {
+    result = kARDMessageResultTypeSuccess;
+  } else if ([resultString isEqualToString:@"INVALID_CLIENT"]) {
+    result = kARDMessageResultTypeInvalidClient;
+  } else if ([resultString isEqualToString:@"INVALID_ROOM"]) {
+    result = kARDMessageResultTypeInvalidRoom;
+  }
+  return result;
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDRoomServerClient.h b/examples/objc/AppRTCDemo/ARDRoomServerClient.h
new file mode 100644
index 0000000..a9ff825
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDRoomServerClient.h
@@ -0,0 +1,33 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+@class ARDJoinResponse;
+@class ARDMessageResponse;
+@class ARDSignalingMessage;
+
+@protocol ARDRoomServerClient <NSObject>
+
+- (void)joinRoomWithRoomId:(NSString *)roomId
+         completionHandler:(void (^)(ARDJoinResponse *response,
+                                     NSError *error))completionHandler;
+
+- (void)sendMessage:(ARDSignalingMessage *)message
+            forRoomId:(NSString *)roomId
+             clientId:(NSString *)clientId
+    completionHandler:(void (^)(ARDMessageResponse *response,
+                                NSError *error))completionHandler;
+
+- (void)leaveRoomWithRoomId:(NSString *)roomId
+                   clientId:(NSString *)clientId
+          completionHandler:(void (^)(NSError *error))completionHandler;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDSDPUtils.h b/examples/objc/AppRTCDemo/ARDSDPUtils.h
new file mode 100644
index 0000000..18795af
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDSDPUtils.h
@@ -0,0 +1,24 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+@class RTCSessionDescription;
+
+@interface ARDSDPUtils : NSObject
+
+// Returns a copy of the given SDP description that prefers the specified
+// video codec, by placing that codec at the front of the m=video codec list
+// if it is present in the SDP.
++ (RTCSessionDescription *)
+    descriptionForDescription:(RTCSessionDescription *)description
+          preferredVideoCodec:(NSString *)codec;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDSDPUtils.m b/examples/objc/AppRTCDemo/ARDSDPUtils.m
new file mode 100644
index 0000000..498a001
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDSDPUtils.m
@@ -0,0 +1,92 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDSDPUtils.h"
+
+#import "RTCLogging.h"
+#import "RTCSessionDescription.h"
+
+@implementation ARDSDPUtils
+
++ (RTCSessionDescription *)
+    descriptionForDescription:(RTCSessionDescription *)description
+          preferredVideoCodec:(NSString *)codec {
+  NSString *sdpString = description.description;
+  NSString *lineSeparator = @"\n";
+  NSString *mLineSeparator = @" ";
+  // Copied from PeerConnectionClient.java.
+  // TODO(tkchin): Move this to a shared C++ file.
+  NSMutableArray *lines =
+      [NSMutableArray arrayWithArray:
+          [sdpString componentsSeparatedByString:lineSeparator]];
+  NSInteger mLineIndex = -1;
+  NSString *codecRtpMap = nil;
+  // a=rtpmap:<payload type> <encoding name>/<clock rate>
+  // [/<encoding parameters>]
+  NSString *pattern =
+      [NSString stringWithFormat:@"^a=rtpmap:(\\d+) %@(/\\d+)+[\r]?$", codec];
+  NSRegularExpression *regex =
+      [NSRegularExpression regularExpressionWithPattern:pattern
+                                                options:0
+                                                  error:nil];
+  for (NSInteger i = 0; (i < lines.count) && (mLineIndex == -1 || !codecRtpMap);
+       ++i) {
+    NSString *line = lines[i];
+    if ([line hasPrefix:@"m=video"]) {
+      mLineIndex = i;
+      continue;
+    }
+    NSTextCheckingResult *codecMatches =
+        [regex firstMatchInString:line
+                          options:0
+                            range:NSMakeRange(0, line.length)];
+    if (codecMatches) {
+      codecRtpMap =
+          [line substringWithRange:[codecMatches rangeAtIndex:1]];
+      continue;
+    }
+  }
+  if (mLineIndex == -1) {
+    RTCLog(@"No m=video line, so can't prefer %@", codec);
+    return description;
+  }
+  if (!codecRtpMap) {
+    RTCLog(@"No rtpmap for %@", codec);
+    return description;
+  }
+  NSArray *origMLineParts =
+      [lines[mLineIndex] componentsSeparatedByString:mLineSeparator];
+  if (origMLineParts.count > 3) {
+    NSMutableArray *newMLineParts =
+        [NSMutableArray arrayWithCapacity:origMLineParts.count];
+    NSInteger origPartIndex = 0;
+    // Format is: m=<media> <port> <proto> <fmt> ...
+    [newMLineParts addObject:origMLineParts[origPartIndex++]];
+    [newMLineParts addObject:origMLineParts[origPartIndex++]];
+    [newMLineParts addObject:origMLineParts[origPartIndex++]];
+    [newMLineParts addObject:codecRtpMap];
+    for (; origPartIndex < origMLineParts.count; ++origPartIndex) {
+      if (![codecRtpMap isEqualToString:origMLineParts[origPartIndex]]) {
+        [newMLineParts addObject:origMLineParts[origPartIndex]];
+      }
+    }
+    NSString *newMLine =
+        [newMLineParts componentsJoinedByString:mLineSeparator];
+    [lines replaceObjectAtIndex:mLineIndex
+                     withObject:newMLine];
+  } else {
+    RTCLogWarning(@"Wrong SDP media description format: %@", lines[mLineIndex]);
+  }
+  NSString *mangledSdpString = [lines componentsJoinedByString:lineSeparator];
+  return [[RTCSessionDescription alloc] initWithType:description.type
+                                                 sdp:mangledSdpString];
+}
+
+@end
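
As a concrete illustration of the rewrite above: remoteDescription stands in for an RTCSessionDescription received over signaling, and the payload type numbers are made up for the example.

  // Suppose the video section of remoteDescription contains:
  //   m=video 9 UDP/TLS/RTP/SAVPF 120 116 100
  //   a=rtpmap:100 H264/90000
  RTCSessionDescription *preferH264 =
      [ARDSDPUtils descriptionForDescription:remoteDescription
                         preferredVideoCodec:@"H264"];
  // The returned description is identical except that the m-line becomes
  //   m=video 9 UDP/TLS/RTP/SAVPF 100 120 116
  // so H264 is negotiated first when both sides support it.
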
diff --git a/examples/objc/AppRTCDemo/ARDSignalingChannel.h b/examples/objc/AppRTCDemo/ARDSignalingChannel.h
new file mode 100644
index 0000000..70ba2ff
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDSignalingChannel.h
@@ -0,0 +1,52 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+#import "ARDSignalingMessage.h"
+
+typedef NS_ENUM(NSInteger, ARDSignalingChannelState) {
+  // State when disconnected.
+  kARDSignalingChannelStateClosed,
+  // State when connection is established but not ready for use.
+  kARDSignalingChannelStateOpen,
+  // State when connection is established and registered.
+  kARDSignalingChannelStateRegistered,
+  // State when connection encounters a fatal error.
+  kARDSignalingChannelStateError
+};
+
+@protocol ARDSignalingChannel;
+@protocol ARDSignalingChannelDelegate <NSObject>
+
+- (void)channel:(id<ARDSignalingChannel>)channel
+    didChangeState:(ARDSignalingChannelState)state;
+
+- (void)channel:(id<ARDSignalingChannel>)channel
+    didReceiveMessage:(ARDSignalingMessage *)message;
+
+@end
+
+@protocol ARDSignalingChannel <NSObject>
+
+@property(nonatomic, readonly) NSString *roomId;
+@property(nonatomic, readonly) NSString *clientId;
+@property(nonatomic, readonly) ARDSignalingChannelState state;
+@property(nonatomic, weak) id<ARDSignalingChannelDelegate> delegate;
+
+// Registers the channel for the given room and client id.
+- (void)registerForRoomId:(NSString *)roomId
+                 clientId:(NSString *)clientId;
+
+// Sends a signaling message over the channel.
+- (void)sendMessage:(ARDSignalingMessage *)message;
+
+@end
+
diff --git a/examples/objc/AppRTCDemo/ARDSignalingMessage.h b/examples/objc/AppRTCDemo/ARDSignalingMessage.h
new file mode 100644
index 0000000..c33997f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDSignalingMessage.h
@@ -0,0 +1,49 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+#import "RTCICECandidate.h"
+#import "RTCSessionDescription.h"
+
+typedef enum {
+  kARDSignalingMessageTypeCandidate,
+  kARDSignalingMessageTypeOffer,
+  kARDSignalingMessageTypeAnswer,
+  kARDSignalingMessageTypeBye,
+} ARDSignalingMessageType;
+
+@interface ARDSignalingMessage : NSObject
+
+@property(nonatomic, readonly) ARDSignalingMessageType type;
+
++ (ARDSignalingMessage *)messageFromJSONString:(NSString *)jsonString;
+- (NSData *)JSONData;
+
+@end
+
+@interface ARDICECandidateMessage : ARDSignalingMessage
+
+@property(nonatomic, readonly) RTCICECandidate *candidate;
+
+- (instancetype)initWithCandidate:(RTCICECandidate *)candidate;
+
+@end
+
+@interface ARDSessionDescriptionMessage : ARDSignalingMessage
+
+@property(nonatomic, readonly) RTCSessionDescription *sessionDescription;
+
+- (instancetype)initWithDescription:(RTCSessionDescription *)description;
+
+@end
+
+@interface ARDByeMessage : ARDSignalingMessage
+@end
diff --git a/examples/objc/AppRTCDemo/ARDSignalingMessage.m b/examples/objc/AppRTCDemo/ARDSignalingMessage.m
new file mode 100644
index 0000000..6a8d37f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDSignalingMessage.m
@@ -0,0 +1,128 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDSignalingMessage.h"
+
+#import "RTCLogging.h"
+
+#import "ARDUtilities.h"
+#import "RTCICECandidate+JSON.h"
+#import "RTCSessionDescription+JSON.h"
+
+static NSString const *kARDSignalingMessageTypeKey = @"type";
+
+@implementation ARDSignalingMessage
+
+@synthesize type = _type;
+
+- (instancetype)initWithType:(ARDSignalingMessageType)type {
+  if (self = [super init]) {
+    _type = type;
+  }
+  return self;
+}
+
+- (NSString *)description {
+  return [[NSString alloc] initWithData:[self JSONData]
+                               encoding:NSUTF8StringEncoding];
+}
+
++ (ARDSignalingMessage *)messageFromJSONString:(NSString *)jsonString {
+  NSDictionary *values = [NSDictionary dictionaryWithJSONString:jsonString];
+  if (!values) {
+    RTCLogError(@"Error parsing signaling message JSON.");
+    return nil;
+  }
+
+  NSString *typeString = values[kARDSignalingMessageTypeKey];
+  ARDSignalingMessage *message = nil;
+  if ([typeString isEqualToString:@"candidate"]) {
+    RTCICECandidate *candidate =
+        [RTCICECandidate candidateFromJSONDictionary:values];
+    message = [[ARDICECandidateMessage alloc] initWithCandidate:candidate];
+  } else if ([typeString isEqualToString:@"offer"] ||
+             [typeString isEqualToString:@"answer"]) {
+    RTCSessionDescription *description =
+        [RTCSessionDescription descriptionFromJSONDictionary:values];
+    message =
+        [[ARDSessionDescriptionMessage alloc] initWithDescription:description];
+  } else if ([typeString isEqualToString:@"bye"]) {
+    message = [[ARDByeMessage alloc] init];
+  } else {
+    RTCLogError(@"Unexpected type: %@", typeString);
+  }
+  return message;
+}
+
+- (NSData *)JSONData {
+  return nil;
+}
+
+@end
+
+@implementation ARDICECandidateMessage
+
+@synthesize candidate = _candidate;
+
+- (instancetype)initWithCandidate:(RTCICECandidate *)candidate {
+  if (self = [super initWithType:kARDSignalingMessageTypeCandidate]) {
+    _candidate = candidate;
+  }
+  return self;
+}
+
+- (NSData *)JSONData {
+  return [_candidate JSONData];
+}
+
+@end
+
+@implementation ARDSessionDescriptionMessage
+
+@synthesize sessionDescription = _sessionDescription;
+
+- (instancetype)initWithDescription:(RTCSessionDescription *)description {
+  ARDSignalingMessageType type = kARDSignalingMessageTypeOffer;
+  NSString *typeString = description.type;
+  if ([typeString isEqualToString:@"offer"]) {
+    type = kARDSignalingMessageTypeOffer;
+  } else if ([typeString isEqualToString:@"answer"]) {
+    type = kARDSignalingMessageTypeAnswer;
+  } else {
+    NSAssert(NO, @"Unexpected type: %@", typeString);
+  }
+  if (self = [super initWithType:type]) {
+    _sessionDescription = description;
+  }
+  return self;
+}
+
+- (NSData *)JSONData {
+  return [_sessionDescription JSONData];
+}
+
+@end
+
+@implementation ARDByeMessage
+
+- (instancetype)init {
+  return [super initWithType:kARDSignalingMessageTypeBye];
+}
+
+- (NSData *)JSONData {
+  NSDictionary *message = @{
+    @"type": @"bye"
+  };
+  return [NSJSONSerialization dataWithJSONObject:message
+                                         options:NSJSONWritingPrettyPrinted
+                                           error:NULL];
+}
+
+@end
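
A short sketch of the dispatch above: parsing a candidate payload (the candidate string and ids are placeholders) yields an ARDICECandidateMessage wrapping the parsed RTCICECandidate.

  NSString *json =
      @"{\"type\":\"candidate\",\"label\":0,\"id\":\"audio\","
      @"\"candidate\":\"candidate:0 1 UDP 2122194687 192.0.2.1 54321 typ host\"}";
  ARDSignalingMessage *message = [ARDSignalingMessage messageFromJSONString:json];
  // message.type == kARDSignalingMessageTypeCandidate, and [message JSONData]
  // delegates to the wrapped candidate's JSON serialization.
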
diff --git a/examples/objc/AppRTCDemo/ARDTURNClient.h b/examples/objc/AppRTCDemo/ARDTURNClient.h
new file mode 100644
index 0000000..8f2a817
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDTURNClient.h
@@ -0,0 +1,20 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+@protocol ARDTURNClient <NSObject>
+
+// Requests TURN server URLs and reports them via the completion handler.
+- (void)requestServersWithCompletionHandler:
+    (void (^)(NSArray *turnServers,
+              NSError *error))completionHandler;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDWebSocketChannel.h b/examples/objc/AppRTCDemo/ARDWebSocketChannel.h
new file mode 100644
index 0000000..2bd6264
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDWebSocketChannel.h
@@ -0,0 +1,31 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+#import "ARDSignalingChannel.h"
+
+// Wraps a WebSocket connection to the AppRTC WebSocket server.
+@interface ARDWebSocketChannel : NSObject <ARDSignalingChannel>
+
+- (instancetype)initWithURL:(NSURL *)url
+                    restURL:(NSURL *)restURL
+                   delegate:(id<ARDSignalingChannelDelegate>)delegate;
+
+// Registers with the WebSocket server for the given room and client id once
+// the WebSocket connection is open.
+- (void)registerForRoomId:(NSString *)roomId
+                 clientId:(NSString *)clientId;
+
+// Sends a message over the WebSocket connection if registered; otherwise POSTs
+// it to the WebSocket server.
+- (void)sendMessage:(ARDSignalingMessage *)message;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ARDWebSocketChannel.m b/examples/objc/AppRTCDemo/ARDWebSocketChannel.m
new file mode 100644
index 0000000..395a22b
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ARDWebSocketChannel.m
@@ -0,0 +1,199 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDWebSocketChannel.h"
+
+#import "RTCLogging.h"
+#import "SRWebSocket.h"
+
+#import "ARDUtilities.h"
+
+// TODO(tkchin): move these to a configuration object.
+static NSString const *kARDWSSMessageErrorKey = @"error";
+static NSString const *kARDWSSMessagePayloadKey = @"msg";
+
+@interface ARDWebSocketChannel () <SRWebSocketDelegate>
+@end
+
+@implementation ARDWebSocketChannel {
+  NSURL *_url;
+  NSURL *_restURL;
+  SRWebSocket *_socket;
+}
+
+@synthesize delegate = _delegate;
+@synthesize state = _state;
+@synthesize roomId = _roomId;
+@synthesize clientId = _clientId;
+
+- (instancetype)initWithURL:(NSURL *)url
+                    restURL:(NSURL *)restURL
+                   delegate:(id<ARDSignalingChannelDelegate>)delegate {
+  if (self = [super init]) {
+    _url = url;
+    _restURL = restURL;
+    _delegate = delegate;
+    _socket = [[SRWebSocket alloc] initWithURL:url];
+    _socket.delegate = self;
+    RTCLog(@"Opening WebSocket.");
+    [_socket open];
+  }
+  return self;
+}
+
+- (void)dealloc {
+  [self disconnect];
+}
+
+- (void)setState:(ARDSignalingChannelState)state {
+  if (_state == state) {
+    return;
+  }
+  _state = state;
+  [_delegate channel:self didChangeState:_state];
+}
+
+- (void)registerForRoomId:(NSString *)roomId
+                 clientId:(NSString *)clientId {
+  NSParameterAssert(roomId.length);
+  NSParameterAssert(clientId.length);
+  _roomId = roomId;
+  _clientId = clientId;
+  if (_state == kARDSignalingChannelStateOpen) {
+    [self registerWithCollider];
+  }
+}
+
+- (void)sendMessage:(ARDSignalingMessage *)message {
+  NSParameterAssert(_clientId.length);
+  NSParameterAssert(_roomId.length);
+  NSData *data = [message JSONData];
+  if (_state == kARDSignalingChannelStateRegistered) {
+    NSString *payload =
+        [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
+    NSDictionary *message = @{
+      @"cmd": @"send",
+      @"msg": payload,
+    };
+    NSData *messageJSONObject =
+        [NSJSONSerialization dataWithJSONObject:message
+                                        options:NSJSONWritingPrettyPrinted
+                                          error:nil];
+    NSString *messageString =
+        [[NSString alloc] initWithData:messageJSONObject
+                              encoding:NSUTF8StringEncoding];
+    RTCLog(@"C->WSS: %@", messageString);
+    [_socket send:messageString];
+  } else {
+    NSString *dataString =
+        [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
+    RTCLog(@"C->WSS POST: %@", dataString);
+    NSString *urlString =
+        [NSString stringWithFormat:@"%@/%@/%@",
+            [_restURL absoluteString], _roomId, _clientId];
+    NSURL *url = [NSURL URLWithString:urlString];
+    [NSURLConnection sendAsyncPostToURL:url
+                               withData:data
+                      completionHandler:nil];
+  }
+}
+
+- (void)disconnect {
+  if (_state == kARDSignalingChannelStateClosed ||
+      _state == kARDSignalingChannelStateError) {
+    return;
+  }
+  [_socket close];
+  RTCLog(@"C->WSS DELETE rid:%@ cid:%@", _roomId, _clientId);
+  NSString *urlString =
+      [NSString stringWithFormat:@"%@/%@/%@",
+          [_restURL absoluteString], _roomId, _clientId];
+  NSURL *url = [NSURL URLWithString:urlString];
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
+  request.HTTPMethod = @"DELETE";
+  request.HTTPBody = nil;
+  [NSURLConnection sendAsyncRequest:request completionHandler:nil];
+}
+
+#pragma mark - SRWebSocketDelegate
+
+- (void)webSocketDidOpen:(SRWebSocket *)webSocket {
+  RTCLog(@"WebSocket connection opened.");
+  self.state = kARDSignalingChannelStateOpen;
+  if (_roomId.length && _clientId.length) {
+    [self registerWithCollider];
+  }
+}
+
+- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message {
+  NSString *messageString = message;
+  NSData *messageData = [messageString dataUsingEncoding:NSUTF8StringEncoding];
+  id jsonObject = [NSJSONSerialization JSONObjectWithData:messageData
+                                                  options:0
+                                                    error:nil];
+  if (![jsonObject isKindOfClass:[NSDictionary class]]) {
+    RTCLogError(@"Unexpected message: %@", jsonObject);
+    return;
+  }
+  NSDictionary *wssMessage = jsonObject;
+  NSString *errorString = wssMessage[kARDWSSMessageErrorKey];
+  if (errorString.length) {
+    RTCLogError(@"WSS error: %@", errorString);
+    return;
+  }
+  NSString *payload = wssMessage[kARDWSSMessagePayloadKey];
+  ARDSignalingMessage *signalingMessage =
+      [ARDSignalingMessage messageFromJSONString:payload];
+  RTCLog(@"WSS->C: %@", payload);
+  [_delegate channel:self didReceiveMessage:signalingMessage];
+}
+
+- (void)webSocket:(SRWebSocket *)webSocket didFailWithError:(NSError *)error {
+  RTCLogError(@"WebSocket error: %@", error);
+  self.state = kARDSignalingChannelStateError;
+}
+
+- (void)webSocket:(SRWebSocket *)webSocket
+    didCloseWithCode:(NSInteger)code
+              reason:(NSString *)reason
+            wasClean:(BOOL)wasClean {
+  RTCLog(@"WebSocket closed with code: %ld reason:%@ wasClean:%d",
+      (long)code, reason, wasClean);
+  NSParameterAssert(_state != kARDSignalingChannelStateError);
+  self.state = kARDSignalingChannelStateClosed;
+}
+
+#pragma mark - Private
+
+- (void)registerWithCollider {
+  if (_state == kARDSignalingChannelStateRegistered) {
+    return;
+  }
+  NSParameterAssert(_roomId.length);
+  NSParameterAssert(_clientId.length);
+  NSDictionary *registerMessage = @{
+    @"cmd": @"register",
+    @"roomid" : _roomId,
+    @"clientid" : _clientId,
+  };
+  NSData *message =
+      [NSJSONSerialization dataWithJSONObject:registerMessage
+                                      options:NSJSONWritingPrettyPrinted
+                                        error:nil];
+  NSString *messageString =
+      [[NSString alloc] initWithData:message encoding:NSUTF8StringEncoding];
+  RTCLog(@"Registering on WSS for rid:%@ cid:%@", _roomId, _clientId);
+  // Registration can fail if the server rejects it, for example when the room
+  // is full.
+  [_socket send:messageString];
+  self.state = kARDSignalingChannelStateRegistered;
+}
+
+@end
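
For reference, these are the two command envelopes the channel writes to the WSS, with placeholder ids: registerWithCollider sends the first, and sendMessage: wraps each signaling message in the second.

  NSDictionary *registerCommand = @{
    @"cmd" : @"register",
    @"roomid" : @"example-room",  // placeholder
    @"clientid" : @"12345",       // placeholder
  };
  NSDictionary *sendCommand = @{
    @"cmd" : @"send",
    @"msg" : @"{\"type\":\"bye\"}",  // a signaling message serialized to a JSON string
  };
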
diff --git a/examples/objc/AppRTCDemo/RTCICECandidate+JSON.h b/examples/objc/AppRTCDemo/RTCICECandidate+JSON.h
new file mode 100644
index 0000000..8ef2748
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCICECandidate+JSON.h
@@ -0,0 +1,18 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCICECandidate.h"
+
+@interface RTCICECandidate (JSON)
+
++ (RTCICECandidate *)candidateFromJSONDictionary:(NSDictionary *)dictionary;
+- (NSData *)JSONData;
+
+@end
diff --git a/examples/objc/AppRTCDemo/RTCICECandidate+JSON.m b/examples/objc/AppRTCDemo/RTCICECandidate+JSON.m
new file mode 100644
index 0000000..cf70b73
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCICECandidate+JSON.m
@@ -0,0 +1,50 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCICECandidate+JSON.h"
+
+#import "RTCLogging.h"
+
+static NSString const *kRTCICECandidateTypeKey = @"type";
+static NSString const *kRTCICECandidateTypeValue = @"candidate";
+static NSString const *kRTCICECandidateMidKey = @"id";
+static NSString const *kRTCICECandidateMLineIndexKey = @"label";
+static NSString const *kRTCICECandidateSdpKey = @"candidate";
+
+@implementation RTCICECandidate (JSON)
+
++ (RTCICECandidate *)candidateFromJSONDictionary:(NSDictionary *)dictionary {
+  NSString *mid = dictionary[kRTCICECandidateMidKey];
+  NSString *sdp = dictionary[kRTCICECandidateSdpKey];
+  NSNumber *num = dictionary[kRTCICECandidateMLineIndexKey];
+  NSInteger mLineIndex = [num integerValue];
+  return [[RTCICECandidate alloc] initWithMid:mid index:mLineIndex sdp:sdp];
+}
+
+- (NSData *)JSONData {
+  NSDictionary *json = @{
+    kRTCICECandidateTypeKey : kRTCICECandidateTypeValue,
+    kRTCICECandidateMLineIndexKey : @(self.sdpMLineIndex),
+    kRTCICECandidateMidKey : self.sdpMid,
+    kRTCICECandidateSdpKey : self.sdp
+  };
+  NSError *error = nil;
+  NSData *data =
+      [NSJSONSerialization dataWithJSONObject:json
+                                      options:NSJSONWritingPrettyPrinted
+                                        error:&error];
+  if (error) {
+    RTCLogError(@"Error serializing JSON: %@", error);
+    return nil;
+  }
+  return data;
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/RTCICEServer+JSON.h b/examples/objc/AppRTCDemo/RTCICEServer+JSON.h
new file mode 100644
index 0000000..10339e7
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCICEServer+JSON.h
@@ -0,0 +1,19 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCICEServer.h"
+
+@interface RTCICEServer (JSON)
+
++ (RTCICEServer *)serverFromJSONDictionary:(NSDictionary *)dictionary;
+// CEOD returns JSON in a different format; this method parses it.
++ (NSArray *)serversFromCEODJSONDictionary:(NSDictionary *)dictionary;
+
+@end
diff --git a/examples/objc/AppRTCDemo/RTCICEServer+JSON.m b/examples/objc/AppRTCDemo/RTCICEServer+JSON.m
new file mode 100644
index 0000000..3ba8556
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCICEServer+JSON.m
@@ -0,0 +1,47 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCICEServer+JSON.h"
+
+static NSString const *kRTCICEServerUsernameKey = @"username";
+static NSString const *kRTCICEServerPasswordKey = @"password";
+static NSString const *kRTCICEServerUrisKey = @"uris";
+static NSString const *kRTCICEServerUrlKey = @"urls";
+static NSString const *kRTCICEServerCredentialKey = @"credential";
+
+@implementation RTCICEServer (JSON)
+
++ (RTCICEServer *)serverFromJSONDictionary:(NSDictionary *)dictionary {
+  NSString *url = dictionary[kRTCICEServerUrlKey];
+  NSString *username = dictionary[kRTCICEServerUsernameKey];
+  NSString *credential = dictionary[kRTCICEServerCredentialKey];
+  username = username ? username : @"";
+  credential = credential ? credential : @"";
+  return [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:url]
+                                  username:username
+                                  password:credential];
+}
+
++ (NSArray *)serversFromCEODJSONDictionary:(NSDictionary *)dictionary {
+  NSString *username = dictionary[kRTCICEServerUsernameKey];
+  NSString *password = dictionary[kRTCICEServerPasswordKey];
+  NSArray *uris = dictionary[kRTCICEServerUrisKey];
+  NSMutableArray *servers = [NSMutableArray arrayWithCapacity:uris.count];
+  for (NSString *uri in uris) {
+    RTCICEServer *server =
+        [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:uri]
+                                 username:username
+                                 password:password];
+    [servers addObject:server];
+  }
+  return servers;
+}
+
+@end
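
A hedged example of the CEOD dictionary shape that serversFromCEODJSONDictionary: consumes; the credentials and URIs are placeholders.

  NSDictionary *ceodJSON = @{
    @"username" : @"placeholder-user",
    @"password" : @"placeholder-pass",
    @"uris" : @[
      @"turn:turn.example.com:3478?transport=udp",
      @"turn:turn.example.com:3478?transport=tcp",
    ],
  };
  NSArray *servers = [RTCICEServer serversFromCEODJSONDictionary:ceodJSON];
  // Yields one RTCICEServer per URI, all sharing the same credentials.
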
diff --git a/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.h b/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.h
new file mode 100644
index 0000000..dd4813a
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.h
@@ -0,0 +1,19 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCMediaConstraints.h"
+
+@interface RTCMediaConstraints (JSON)
+
++ (RTCMediaConstraints *)constraintsFromJSONDictionary:
+    (NSDictionary *)dictionary;
+
+@end
+
diff --git a/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.m b/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.m
new file mode 100644
index 0000000..b03773e
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.m
@@ -0,0 +1,37 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCMediaConstraints+JSON.h"
+
+#import "RTCPair.h"
+
+static NSString const *kRTCMediaConstraintsMandatoryKey = @"mandatory";
+
+@implementation RTCMediaConstraints (JSON)
+
++ (RTCMediaConstraints *)constraintsFromJSONDictionary:
+    (NSDictionary *)dictionary {
+  NSDictionary *mandatory = dictionary[kRTCMediaConstraintsMandatoryKey];
+  NSMutableArray *mandatoryConstraints =
+      [NSMutableArray arrayWithCapacity:[mandatory count]];
+  [mandatory enumerateKeysAndObjectsUsingBlock:^(
+      id key, id obj, BOOL *stop) {
+    [mandatoryConstraints addObject:[[RTCPair alloc] initWithKey:key
+                                                           value:obj]];
+  }];
+  // TODO(tkchin): figure out JSON formats for optional constraints.
+  RTCMediaConstraints *constraints =
+      [[RTCMediaConstraints alloc]
+          initWithMandatoryConstraints:mandatoryConstraints
+                   optionalConstraints:nil];
+  return constraints;
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.h b/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.h
new file mode 100644
index 0000000..ee323a7
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.h
@@ -0,0 +1,19 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCSessionDescription.h"
+
+@interface RTCSessionDescription (JSON)
+
++ (RTCSessionDescription *)descriptionFromJSONDictionary:
+    (NSDictionary *)dictionary;
+- (NSData *)JSONData;
+
+@end
diff --git a/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.m b/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.m
new file mode 100644
index 0000000..b5655e0
--- /dev/null
+++ b/examples/objc/AppRTCDemo/RTCSessionDescription+JSON.m
@@ -0,0 +1,33 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "RTCSessionDescription+JSON.h"
+
+static NSString * const kRTCSessionDescriptionTypeKey = @"type";
+static NSString * const kRTCSessionDescriptionSdpKey = @"sdp";
+
+@implementation RTCSessionDescription (JSON)
+
++ (RTCSessionDescription *)descriptionFromJSONDictionary:
+    (NSDictionary *)dictionary {
+  NSString *type = dictionary[kRTCSessionDescriptionTypeKey];
+  NSString *sdp = dictionary[kRTCSessionDescriptionSdpKey];
+  return [[RTCSessionDescription alloc] initWithType:type sdp:sdp];
+}
+
+- (NSData *)JSONData {
+  NSDictionary *json = @{
+    kRTCSessionDescriptionTypeKey : self.type,
+    kRTCSessionDescriptionSdpKey : self.description
+  };
+  return [NSJSONSerialization dataWithJSONObject:json options:0 error:nil];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/common/ARDUtilities.h b/examples/objc/AppRTCDemo/common/ARDUtilities.h
new file mode 100644
index 0000000..6f94ef7
--- /dev/null
+++ b/examples/objc/AppRTCDemo/common/ARDUtilities.h
@@ -0,0 +1,35 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+
+@interface NSDictionary (ARDUtilities)
+
+// Creates a dictionary with the keys and values in the JSON object.
++ (NSDictionary *)dictionaryWithJSONString:(NSString *)jsonString;
++ (NSDictionary *)dictionaryWithJSONData:(NSData *)jsonData;
+
+@end
+
+@interface NSURLConnection (ARDUtilities)
+
+// Issues an asynchronous request that calls back on the main queue.
++ (void)sendAsyncRequest:(NSURLRequest *)request
+       completionHandler:(void (^)(NSURLResponse *response,
+                                   NSData *data,
+                                   NSError *error))completionHandler;
+
+// Posts data to the specified URL.
++ (void)sendAsyncPostToURL:(NSURL *)url
+                  withData:(NSData *)data
+         completionHandler:(void (^)(BOOL succeeded,
+                                     NSData *data))completionHandler;
+
+@end
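Usage sketch (not part of the patch) for the two categories above; the URL and
request body are placeholders, not the actual AppRTC endpoints:

    #import "ARDUtilities.h"

    NSURL *url = [NSURL URLWithString:@"https://apprtc.appspot.com/join/myroom"];
    NSData *body = [@"{}" dataUsingEncoding:NSUTF8StringEncoding];
    [NSURLConnection sendAsyncPostToURL:url
                               withData:body
                      completionHandler:^(BOOL succeeded, NSData *data) {
      if (!succeeded) {
        return;
      }
      // Parse the JSON response on the main queue.
      NSDictionary *response = [NSDictionary dictionaryWithJSONData:data];
      NSLog(@"Response: %@", response);
    }];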
diff --git a/examples/objc/AppRTCDemo/common/ARDUtilities.m b/examples/objc/AppRTCDemo/common/ARDUtilities.m
new file mode 100644
index 0000000..257b6a6
--- /dev/null
+++ b/examples/objc/AppRTCDemo/common/ARDUtilities.m
@@ -0,0 +1,95 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDUtilities.h"
+
+#import "RTCLogging.h"
+
+@implementation NSDictionary (ARDUtilities)
+
++ (NSDictionary *)dictionaryWithJSONString:(NSString *)jsonString {
+  NSParameterAssert(jsonString.length > 0);
+  NSData *data = [jsonString dataUsingEncoding:NSUTF8StringEncoding];
+  NSError *error = nil;
+  NSDictionary *dict =
+      [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
+  if (error) {
+    RTCLogError(@"Error parsing JSON: %@", error.localizedDescription);
+  }
+  return dict;
+}
+
++ (NSDictionary *)dictionaryWithJSONData:(NSData *)jsonData {
+  NSError *error = nil;
+  NSDictionary *dict =
+      [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
+  if (error) {
+    RTCLogError(@"Error parsing JSON: %@", error.localizedDescription);
+  }
+  return dict;
+}
+
+@end
+
+@implementation NSURLConnection (ARDUtilities)
+
++ (void)sendAsyncRequest:(NSURLRequest *)request
+       completionHandler:(void (^)(NSURLResponse *response,
+                                   NSData *data,
+                                   NSError *error))completionHandler {
+  // Kick off an async request that will call back on the main thread.
+  [NSURLConnection sendAsynchronousRequest:request
+                                     queue:[NSOperationQueue mainQueue]
+                         completionHandler:^(NSURLResponse *response,
+                                             NSData *data,
+                                             NSError *error) {
+    if (completionHandler) {
+      completionHandler(response, data, error);
+    }
+  }];
+}
+
+// Posts data to the specified URL.
++ (void)sendAsyncPostToURL:(NSURL *)url
+                  withData:(NSData *)data
+         completionHandler:(void (^)(BOOL succeeded,
+                                     NSData *data))completionHandler {
+  NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
+  request.HTTPMethod = @"POST";
+  request.HTTPBody = data;
+  [[self class] sendAsyncRequest:request
+                completionHandler:^(NSURLResponse *response,
+                                    NSData *data,
+                                    NSError *error) {
+    if (error) {
+      RTCLogError(@"Error posting data: %@", error.localizedDescription);
+      if (completionHandler) {
+        completionHandler(NO, data);
+      }
+      return;
+    }
+    NSHTTPURLResponse *httpResponse = (NSHTTPURLResponse *)response;
+    if (httpResponse.statusCode != 200) {
+      NSString *serverResponse = data.length > 0 ?
+          [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding] :
+          nil;
+      RTCLogError(@"Received bad response: %@", serverResponse);
+      if (completionHandler) {
+        completionHandler(NO, data);
+      }
+      return;
+    }
+    if (completionHandler) {
+      completionHandler(YES, data);
+    }
+  }];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDAppDelegate.h b/examples/objc/AppRTCDemo/ios/ARDAppDelegate.h
new file mode 100644
index 0000000..c73e8f2
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDAppDelegate.h
@@ -0,0 +1,17 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+// The main application class of the AppRTCDemo iOS app demonstrating
+// interoperability between the Objective-C implementation of PeerConnection
+// and the apprtc.appspot.com demo webapp.
+@interface ARDAppDelegate : NSObject <UIApplicationDelegate>
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDAppDelegate.m b/examples/objc/AppRTCDemo/ios/ARDAppDelegate.m
new file mode 100644
index 0000000..0f4165e
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDAppDelegate.m
@@ -0,0 +1,52 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDAppDelegate.h"
+
+#import "RTCLogging.h"
+#import "RTCPeerConnectionFactory.h"
+
+#import "ARDMainViewController.h"
+
+@implementation ARDAppDelegate {
+  UIWindow *_window;
+}
+
+#pragma mark - UIApplicationDelegate methods
+
+- (BOOL)application:(UIApplication *)application
+    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
+  [RTCPeerConnectionFactory initializeSSL];
+  _window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
+  [_window makeKeyAndVisible];
+  ARDMainViewController *viewController = [[ARDMainViewController alloc] init];
+  _window.rootViewController = viewController;
+
+#ifndef _DEBUG
+  // In debug builds the default level is LS_INFO and in non-debug builds it is
+  // disabled. Continue to log to console in non-debug builds, but only
+  // warnings and errors.
+  RTCSetMinDebugLogLevel(kRTCLoggingSeverityWarning);
+#endif
+
+  return YES;
+}
+
+- (void)applicationWillResignActive:(UIApplication *)application {
+  ARDMainViewController *viewController =
+      (ARDMainViewController *)_window.rootViewController;
+  [viewController applicationWillResignActive:application];
+}
+
+- (void)applicationWillTerminate:(UIApplication *)application {
+  [RTCPeerConnectionFactory deinitializeSSL];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDMainView.h b/examples/objc/AppRTCDemo/ios/ARDMainView.h
new file mode 100644
index 0000000..f091ad0
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDMainView.h
@@ -0,0 +1,27 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+@class ARDMainView;
+
+@protocol ARDMainViewDelegate <NSObject>
+
+- (void)mainView:(ARDMainView *)mainView didInputRoom:(NSString *)room;
+
+@end
+
+// The main view of AppRTCDemo. It contains an input field for entering an
+// AppRTC room name to connect to.
+@interface ARDMainView : UIView
+
+@property(nonatomic, weak) id<ARDMainViewDelegate> delegate;
+
+@end
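Sketch (not part of the patch) of a hypothetical consumer conforming to
ARDMainViewDelegate; in this change the real consumer is ARDMainViewController:

    #import "ARDMainView.h"

    @interface MYRoomPicker : UIViewController <ARDMainViewDelegate>
    @end

    @implementation MYRoomPicker
    - (void)mainView:(ARDMainView *)mainView didInputRoom:(NSString *)room {
      // The shipped app validates the room name and starts a call here.
      NSLog(@"Room entered: %@", room);
    }
    @end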
diff --git a/examples/objc/AppRTCDemo/ios/ARDMainView.m b/examples/objc/AppRTCDemo/ios/ARDMainView.m
new file mode 100644
index 0000000..295b59c
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDMainView.m
@@ -0,0 +1,168 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDMainView.h"
+
+#import "UIImage+ARDUtilities.h"
+
+// TODO(tkchin): retrieve status bar height dynamically.
+static CGFloat const kStatusBarHeight = 20;
+
+static CGFloat const kRoomTextButtonSize = 40;
+static CGFloat const kRoomTextFieldHeight = 40;
+static CGFloat const kRoomTextFieldMargin = 8;
+static CGFloat const kAppLabelHeight = 20;
+
+@class ARDRoomTextField;
+@protocol ARDRoomTextFieldDelegate <NSObject>
+- (void)roomTextField:(ARDRoomTextField *)roomTextField
+         didInputRoom:(NSString *)room;
+@end
+
+// Helper view that contains a text field and a clear button.
+@interface ARDRoomTextField : UIView <UITextFieldDelegate>
+@property(nonatomic, weak) id<ARDRoomTextFieldDelegate> delegate;
+@end
+
+@implementation ARDRoomTextField {
+  UITextField *_roomText;
+  UIButton *_clearButton;
+}
+
+@synthesize delegate = _delegate;
+
+- (instancetype)initWithFrame:(CGRect)frame {
+  if (self = [super initWithFrame:frame]) {
+    _roomText = [[UITextField alloc] initWithFrame:CGRectZero];
+    _roomText.borderStyle = UITextBorderStyleNone;
+    _roomText.font = [UIFont fontWithName:@"Roboto" size:12];
+    _roomText.placeholder = @"Room name";
+    _roomText.delegate = self;
+    [_roomText addTarget:self
+                  action:@selector(textFieldDidChange:)
+        forControlEvents:UIControlEventEditingChanged];
+    [self addSubview:_roomText];
+
+    _clearButton = [UIButton buttonWithType:UIButtonTypeCustom];
+    UIImage *image = [UIImage imageForName:@"ic_clear_black_24dp.png"
+                                     color:[UIColor colorWithWhite:0 alpha:.4]];
+
+    [_clearButton setImage:image forState:UIControlStateNormal];
+    [_clearButton addTarget:self
+                      action:@selector(onClear:)
+            forControlEvents:UIControlEventTouchUpInside];
+    _clearButton.hidden = YES;
+    [self addSubview:_clearButton];
+
+    // Give rounded corners and a light gray border.
+    self.layer.borderWidth = 1;
+    self.layer.borderColor = [[UIColor lightGrayColor] CGColor];
+    self.layer.cornerRadius = 2;
+  }
+  return self;
+}
+
+- (void)layoutSubviews {
+  CGRect bounds = self.bounds;
+  _clearButton.frame = CGRectMake(CGRectGetMaxX(bounds) - kRoomTextButtonSize,
+                                  CGRectGetMinY(bounds),
+                                  kRoomTextButtonSize,
+                                  kRoomTextButtonSize);
+  _roomText.frame = CGRectMake(
+      CGRectGetMinX(bounds) + kRoomTextFieldMargin,
+      CGRectGetMinY(bounds),
+      CGRectGetMinX(_clearButton.frame) - CGRectGetMinX(bounds) -
+          kRoomTextFieldMargin,
+      kRoomTextFieldHeight);
+}
+
+- (CGSize)sizeThatFits:(CGSize)size {
+  size.height = kRoomTextFieldHeight;
+  return size;
+}
+
+#pragma mark - UITextFieldDelegate
+
+- (void)textFieldDidEndEditing:(UITextField *)textField {
+  [_delegate roomTextField:self didInputRoom:textField.text];
+}
+
+- (BOOL)textFieldShouldReturn:(UITextField *)textField {
+  // There is no other control that can take focus, so manually resign focus
+  // when return (Join) is pressed to trigger |textFieldDidEndEditing|.
+  [textField resignFirstResponder];
+  return YES;
+}
+
+#pragma mark - Private
+
+- (void)textFieldDidChange:(id)sender {
+  [self updateClearButton];
+}
+
+- (void)onClear:(id)sender {
+  _roomText.text = @"";
+  [self updateClearButton];
+  [_roomText resignFirstResponder];
+}
+
+- (void)updateClearButton {
+  _clearButton.hidden = _roomText.text.length == 0;
+}
+
+@end
+
+@interface ARDMainView () <ARDRoomTextFieldDelegate>
+@end
+
+@implementation ARDMainView {
+  UILabel *_appLabel;
+  ARDRoomTextField *_roomText;
+}
+
+@synthesize delegate = _delegate;
+
+- (instancetype)initWithFrame:(CGRect)frame {
+  if (self = [super initWithFrame:frame]) {
+    _appLabel = [[UILabel alloc] initWithFrame:CGRectZero];
+    _appLabel.text = @"AppRTCDemo";
+    _appLabel.font = [UIFont fontWithName:@"Roboto" size:34];
+    _appLabel.textColor = [UIColor colorWithWhite:0 alpha:.2];
+    [_appLabel sizeToFit];
+    [self addSubview:_appLabel];
+
+    _roomText = [[ARDRoomTextField alloc] initWithFrame:CGRectZero];
+    _roomText.delegate = self;
+    [self addSubview:_roomText];
+
+    self.backgroundColor = [UIColor whiteColor];
+  }
+  return self;
+}
+
+- (void)layoutSubviews {
+  CGRect bounds = self.bounds;
+  CGFloat roomTextWidth = bounds.size.width - 2 * kRoomTextFieldMargin;
+  CGFloat roomTextHeight = [_roomText sizeThatFits:bounds.size].height;
+  _roomText.frame = CGRectMake(kRoomTextFieldMargin,
+                               kStatusBarHeight + kRoomTextFieldMargin,
+                               roomTextWidth,
+                               roomTextHeight);
+  _appLabel.center = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
+}
+
+#pragma mark - ARDRoomTextFieldDelegate
+
+- (void)roomTextField:(ARDRoomTextField *)roomTextField
+         didInputRoom:(NSString *)room {
+  [_delegate mainView:self didInputRoom:room];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDMainViewController.h b/examples/objc/AppRTCDemo/ios/ARDMainViewController.h
new file mode 100644
index 0000000..cc38170
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDMainViewController.h
@@ -0,0 +1,17 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+@interface ARDMainViewController : UIViewController
+
+- (void)applicationWillResignActive:(UIApplication *)application;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDMainViewController.m b/examples/objc/AppRTCDemo/ios/ARDMainViewController.m
new file mode 100644
index 0000000..3721fe9
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDMainViewController.m
@@ -0,0 +1,85 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDMainViewController.h"
+
+#import "ARDAppClient.h"
+#import "ARDMainView.h"
+#import "ARDVideoCallViewController.h"
+
+@interface ARDMainViewController () <ARDMainViewDelegate>
+@end
+
+@implementation ARDMainViewController
+
+- (void)loadView {
+  ARDMainView *mainView = [[ARDMainView alloc] initWithFrame:CGRectZero];
+  mainView.delegate = self;
+  self.view = mainView;
+}
+
+- (void)applicationWillResignActive:(UIApplication *)application {
+  // Terminate any calls when we aren't active.
+  [self dismissViewControllerAnimated:NO completion:nil];
+}
+
+#pragma mark - ARDMainViewDelegate
+
+- (void)mainView:(ARDMainView *)mainView didInputRoom:(NSString *)room {
+  if (!room.length) {
+    return;
+  }
+  // Trim whitespaces.
+  NSCharacterSet *whitespaceSet = [NSCharacterSet whitespaceCharacterSet];
+  NSString *trimmedRoom = [room stringByTrimmingCharactersInSet:whitespaceSet];
+
+  // Check that room name is valid.
+  NSError *error = nil;
+  NSRegularExpressionOptions options = NSRegularExpressionCaseInsensitive;
+  NSRegularExpression *regex =
+      [NSRegularExpression regularExpressionWithPattern:@"\\w+"
+                                                options:options
+                                                  error:&error];
+  if (error) {
+    [self showAlertWithMessage:error.localizedDescription];
+    return;
+  }
+  NSRange matchRange =
+      [regex rangeOfFirstMatchInString:trimmedRoom
+                               options:0
+                                 range:NSMakeRange(0, trimmedRoom.length)];
+  if (matchRange.location == NSNotFound ||
+      matchRange.length != trimmedRoom.length) {
+    [self showAlertWithMessage:@"Invalid room name."];
+    return;
+  }
+
+  // Kick off the video call.
+  ARDVideoCallViewController *videoCallViewController =
+      [[ARDVideoCallViewController alloc] initForRoom:trimmedRoom];
+  videoCallViewController.modalTransitionStyle =
+      UIModalTransitionStyleCrossDissolve;
+  [self presentViewController:videoCallViewController
+                     animated:YES
+                   completion:nil];
+}
+
+#pragma mark - Private
+
+- (void)showAlertWithMessage:(NSString*)message {
+  UIAlertView* alertView = [[UIAlertView alloc] initWithTitle:nil
+                                                      message:message
+                                                     delegate:nil
+                                            cancelButtonTitle:@"OK"
+                                            otherButtonTitles:nil];
+  [alertView show];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDVideoCallView.h b/examples/objc/AppRTCDemo/ios/ARDVideoCallView.h
new file mode 100644
index 0000000..3208925
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDVideoCallView.h
@@ -0,0 +1,35 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+#import "RTCEAGLVideoView.h"
+
+@class ARDVideoCallView;
+@protocol ARDVideoCallViewDelegate <NSObject>
+
+// Called when the camera switch button is pressed.
+- (void)videoCallViewDidSwitchCamera:(ARDVideoCallView *)view;
+
+// Called when the hangup button is pressed.
+- (void)videoCallViewDidHangup:(ARDVideoCallView *)view;
+
+@end
+
+// Video call view that shows local and remote video, a status label, and a
+// hangup button.
+@interface ARDVideoCallView : UIView
+
+@property(nonatomic, readonly) UILabel *statusLabel;
+@property(nonatomic, readonly) RTCEAGLVideoView *localVideoView;
+@property(nonatomic, readonly) RTCEAGLVideoView *remoteVideoView;
+@property(nonatomic, weak) id<ARDVideoCallViewDelegate> delegate;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDVideoCallView.m b/examples/objc/AppRTCDemo/ios/ARDVideoCallView.m
new file mode 100644
index 0000000..45a69cf
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDVideoCallView.m
@@ -0,0 +1,162 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDVideoCallView.h"
+
+#import <AVFoundation/AVFoundation.h>
+#import "UIImage+ARDUtilities.h"
+
+static CGFloat const kButtonPadding = 16;
+static CGFloat const kButtonSize = 48;
+static CGFloat const kLocalVideoViewSize = 120;
+static CGFloat const kLocalVideoViewPadding = 8;
+
+@interface ARDVideoCallView () <RTCEAGLVideoViewDelegate>
+@end
+
+@implementation ARDVideoCallView {
+  UIButton *_cameraSwitchButton;
+  UIButton *_hangupButton;
+  CGSize _localVideoSize;
+  CGSize _remoteVideoSize;
+  BOOL _useRearCamera;
+}
+
+@synthesize statusLabel = _statusLabel;
+@synthesize localVideoView = _localVideoView;
+@synthesize remoteVideoView = _remoteVideoView;
+@synthesize delegate = _delegate;
+
+- (instancetype)initWithFrame:(CGRect)frame {
+  if (self = [super initWithFrame:frame]) {
+    _remoteVideoView = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
+    _remoteVideoView.delegate = self;
+    [self addSubview:_remoteVideoView];
+
+    // TODO(tkchin): replace this with a view that renders layer from
+    // AVCaptureSession.
+    _localVideoView = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
+    _localVideoView.delegate = self;
+    [self addSubview:_localVideoView];
+
+    // TODO(tkchin): don't display this if we can't actually do camera switch.
+    _cameraSwitchButton = [UIButton buttonWithType:UIButtonTypeCustom];
+    _cameraSwitchButton.backgroundColor = [UIColor whiteColor];
+    _cameraSwitchButton.layer.cornerRadius = kButtonSize / 2;
+    _cameraSwitchButton.layer.masksToBounds = YES;
+    UIImage *image = [UIImage imageNamed:@"ic_switch_video_black_24dp.png"];
+    [_cameraSwitchButton setImage:image forState:UIControlStateNormal];
+    [_cameraSwitchButton addTarget:self
+                      action:@selector(onCameraSwitch:)
+            forControlEvents:UIControlEventTouchUpInside];
+    [self addSubview:_cameraSwitchButton];
+
+    _hangupButton = [UIButton buttonWithType:UIButtonTypeCustom];
+    _hangupButton.backgroundColor = [UIColor redColor];
+    _hangupButton.layer.cornerRadius = kButtonSize / 2;
+    _hangupButton.layer.masksToBounds = YES;
+    image = [UIImage imageForName:@"ic_call_end_black_24dp.png"
+                            color:[UIColor whiteColor]];
+    [_hangupButton setImage:image forState:UIControlStateNormal];
+    [_hangupButton addTarget:self
+                      action:@selector(onHangup:)
+            forControlEvents:UIControlEventTouchUpInside];
+    [self addSubview:_hangupButton];
+
+    _statusLabel = [[UILabel alloc] initWithFrame:CGRectZero];
+    _statusLabel.font = [UIFont fontWithName:@"Roboto" size:16];
+    _statusLabel.textColor = [UIColor whiteColor];
+    [self addSubview:_statusLabel];
+  }
+  return self;
+}
+
+- (void)layoutSubviews {
+  CGRect bounds = self.bounds;
+  if (_remoteVideoSize.width > 0 && _remoteVideoSize.height > 0) {
+    // Aspect fill remote video into bounds.
+    CGRect remoteVideoFrame =
+        AVMakeRectWithAspectRatioInsideRect(_remoteVideoSize, bounds);
+    CGFloat scale = 1;
+    if (remoteVideoFrame.size.width > remoteVideoFrame.size.height) {
+      // Scale by height.
+      scale = bounds.size.height / remoteVideoFrame.size.height;
+    } else {
+      // Scale by width.
+      scale = bounds.size.width / remoteVideoFrame.size.width;
+    }
+    remoteVideoFrame.size.height *= scale;
+    remoteVideoFrame.size.width *= scale;
+    _remoteVideoView.frame = remoteVideoFrame;
+    _remoteVideoView.center =
+        CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
+  } else {
+    _remoteVideoView.frame = bounds;
+  }
+
+  if (_localVideoSize.width > 0 && _localVideoSize.height > 0) {
+    // Aspect fit local video view into a square box.
+    CGRect localVideoFrame =
+        CGRectMake(0, 0, kLocalVideoViewSize, kLocalVideoViewSize);
+    localVideoFrame =
+        AVMakeRectWithAspectRatioInsideRect(_localVideoSize, localVideoFrame);
+
+    // Place the view in the bottom right.
+    localVideoFrame.origin.x = CGRectGetMaxX(bounds)
+        - localVideoFrame.size.width - kLocalVideoViewPadding;
+    localVideoFrame.origin.y = CGRectGetMaxY(bounds)
+        - localVideoFrame.size.height - kLocalVideoViewPadding;
+    _localVideoView.frame = localVideoFrame;
+  } else {
+    _localVideoView.frame = bounds;
+  }
+
+  // Place hangup button in the bottom left.
+  _hangupButton.frame =
+      CGRectMake(CGRectGetMinX(bounds) + kButtonPadding,
+                 CGRectGetMaxY(bounds) - kButtonPadding -
+                     kButtonSize,
+                 kButtonSize,
+                 kButtonSize);
+
+  // Place button to the right of hangup button.
+  CGRect cameraSwitchFrame = _hangupButton.frame;
+  cameraSwitchFrame.origin.x =
+      CGRectGetMaxX(cameraSwitchFrame) + kButtonPadding;
+  _cameraSwitchButton.frame = cameraSwitchFrame;
+
+  [_statusLabel sizeToFit];
+  _statusLabel.center =
+      CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
+}
+
+#pragma mark - RTCEAGLVideoViewDelegate
+
+- (void)videoView:(RTCEAGLVideoView*)videoView didChangeVideoSize:(CGSize)size {
+  if (videoView == _localVideoView) {
+    _localVideoSize = size;
+    _localVideoView.hidden = CGSizeEqualToSize(CGSizeZero, _localVideoSize);
+  } else if (videoView == _remoteVideoView) {
+    _remoteVideoSize = size;
+  }
+  [self setNeedsLayout];
+}
+
+#pragma mark - Private
+
+- (void)onCameraSwitch:(id)sender {
+  [_delegate videoCallViewDidSwitchCamera:self];
+}
+
+- (void)onHangup:(id)sender {
+  [_delegate videoCallViewDidHangup:self];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.h b/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.h
new file mode 100644
index 0000000..9616da5
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.h
@@ -0,0 +1,17 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+@interface ARDVideoCallViewController : UIViewController
+
+- (instancetype)initForRoom:(NSString *)room;
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.m b/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.m
new file mode 100644
index 0000000..36c0902
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.m
@@ -0,0 +1,177 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "ARDVideoCallViewController.h"
+
+#import "RTCAVFoundationVideoSource.h"
+#import "RTCLogging.h"
+
+#import "ARDAppClient.h"
+#import "ARDVideoCallView.h"
+
+@interface ARDVideoCallViewController () <ARDAppClientDelegate,
+    ARDVideoCallViewDelegate>
+@property(nonatomic, strong) RTCVideoTrack *localVideoTrack;
+@property(nonatomic, strong) RTCVideoTrack *remoteVideoTrack;
+@property(nonatomic, readonly) ARDVideoCallView *videoCallView;
+@end
+
+@implementation ARDVideoCallViewController {
+  ARDAppClient *_client;
+  RTCVideoTrack *_remoteVideoTrack;
+  RTCVideoTrack *_localVideoTrack;
+}
+
+@synthesize videoCallView = _videoCallView;
+
+- (instancetype)initForRoom:(NSString *)room {
+  if (self = [super init]) {
+    _client = [[ARDAppClient alloc] initWithDelegate:self];
+    [_client connectToRoomWithId:room options:nil];
+  }
+  return self;
+}
+
+- (void)loadView {
+  _videoCallView = [[ARDVideoCallView alloc] initWithFrame:CGRectZero];
+  _videoCallView.delegate = self;
+  _videoCallView.statusLabel.text =
+      [self statusTextForState:RTCICEConnectionNew];
+  self.view = _videoCallView;
+}
+
+#pragma mark - ARDAppClientDelegate
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeState:(ARDAppClientState)state {
+  switch (state) {
+    case kARDAppClientStateConnected:
+      RTCLog(@"Client connected.");
+      break;
+    case kARDAppClientStateConnecting:
+      RTCLog(@"Client connecting.");
+      break;
+    case kARDAppClientStateDisconnected:
+      RTCLog(@"Client disconnected.");
+      [self hangup];
+      break;
+  }
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeConnectionState:(RTCICEConnectionState)state {
+  RTCLog(@"ICE state changed: %d", state);
+  __weak ARDVideoCallViewController *weakSelf = self;
+  dispatch_async(dispatch_get_main_queue(), ^{
+    ARDVideoCallViewController *strongSelf = weakSelf;
+    strongSelf.videoCallView.statusLabel.text =
+        [strongSelf statusTextForState:state];
+  });
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
+  self.localVideoTrack = localVideoTrack;
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveRemoteVideoTrack:(RTCVideoTrack *)remoteVideoTrack {
+  self.remoteVideoTrack = remoteVideoTrack;
+  _videoCallView.statusLabel.hidden = YES;
+}
+
+- (void)appClient:(ARDAppClient *)client
+         didError:(NSError *)error {
+  NSString *message =
+      [NSString stringWithFormat:@"%@", error.localizedDescription];
+  [self showAlertWithMessage:message];
+  [self hangup];
+}
+
+#pragma mark - ARDVideoCallViewDelegate
+
+- (void)videoCallViewDidHangup:(ARDVideoCallView *)view {
+  [self hangup];
+}
+
+- (void)videoCallViewDidSwitchCamera:(ARDVideoCallView *)view {
+  // TODO(tkchin): Rate limit this so you can't tap continuously on it.
+  // Probably through an animation.
+  [self switchCamera];
+}
+
+#pragma mark - Private
+
+- (void)setLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
+  if (_localVideoTrack == localVideoTrack) {
+    return;
+  }
+  [_localVideoTrack removeRenderer:_videoCallView.localVideoView];
+  _localVideoTrack = nil;
+  [_videoCallView.localVideoView renderFrame:nil];
+  _localVideoTrack = localVideoTrack;
+  [_localVideoTrack addRenderer:_videoCallView.localVideoView];
+}
+
+- (void)setRemoteVideoTrack:(RTCVideoTrack *)remoteVideoTrack {
+  if (_remoteVideoTrack == remoteVideoTrack) {
+    return;
+  }
+  [_remoteVideoTrack removeRenderer:_videoCallView.remoteVideoView];
+  _remoteVideoTrack = nil;
+  [_videoCallView.remoteVideoView renderFrame:nil];
+  _remoteVideoTrack = remoteVideoTrack;
+  [_remoteVideoTrack addRenderer:_videoCallView.remoteVideoView];
+}
+
+- (void)hangup {
+  self.remoteVideoTrack = nil;
+  self.localVideoTrack = nil;
+  [_client disconnect];
+  if (![self isBeingDismissed]) {
+    [self.presentingViewController dismissViewControllerAnimated:YES
+                                                      completion:nil];
+  }
+}
+
+- (void)switchCamera {
+  RTCVideoSource* source = self.localVideoTrack.source;
+  if ([source isKindOfClass:[RTCAVFoundationVideoSource class]]) {
+    RTCAVFoundationVideoSource* avSource = (RTCAVFoundationVideoSource*)source;
+    avSource.useBackCamera = !avSource.useBackCamera;
+    _videoCallView.localVideoView.transform = avSource.useBackCamera ?
+        CGAffineTransformIdentity : CGAffineTransformMakeScale(-1, 1);
+  }
+}
+
+- (NSString *)statusTextForState:(RTCICEConnectionState)state {
+  switch (state) {
+    case RTCICEConnectionNew:
+    case RTCICEConnectionChecking:
+      return @"Connecting...";
+    case RTCICEConnectionConnected:
+    case RTCICEConnectionCompleted:
+    case RTCICEConnectionFailed:
+    case RTCICEConnectionDisconnected:
+    case RTCICEConnectionClosed:
+      return nil;
+  }
+}
+
+- (void)showAlertWithMessage:(NSString*)message {
+  UIAlertView* alertView = [[UIAlertView alloc] initWithTitle:nil
+                                                      message:message
+                                                     delegate:nil
+                                            cancelButtonTitle:@"OK"
+                                            otherButtonTitles:nil];
+  [alertView show];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/AppRTCDemo-Prefix.pch b/examples/objc/AppRTCDemo/ios/AppRTCDemo-Prefix.pch
new file mode 100644
index 0000000..6a5c375
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/AppRTCDemo-Prefix.pch
@@ -0,0 +1,23 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+//
+// Prefix header for all source files of the 'AppRTCDemo' target in the
+// 'AppRTCDemo' project
+//
+
+#import <Availability.h>
+
+#if __IPHONE_OS_VERSION_MIN_REQUIRED < __IPHONE_6_0
+#warning "This project uses features only available in iOS SDK 6.0 and later."
+#endif
+
+#import <Foundation/Foundation.h>
+#import <UIKit/UIKit.h>
diff --git a/examples/objc/AppRTCDemo/ios/Info.plist b/examples/objc/AppRTCDemo/ios/Info.plist
new file mode 100644
index 0000000..fd1e26f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/Info.plist
@@ -0,0 +1,97 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+  <key>BuildMachineOSBuild</key>
+  <string>12E55</string>
+  <key>CFBundleDevelopmentRegion</key>
+  <string>en</string>
+  <key>CFBundleDisplayName</key>
+  <string>AppRTCDemo</string>
+  <key>CFBundleExecutable</key>
+  <string>AppRTCDemo</string>
+  <key>CFBundleIcons</key>
+  <dict>
+    <key>CFBundlePrimaryIcon</key>
+    <dict>
+      <key>CFBundleIconFiles</key>
+      <array>
+        <string>Icon.png</string>
+      </array>
+    </dict>
+  </dict>
+  <key>CFBundleIdentifier</key>
+  <string>com.google.AppRTCDemo</string>
+  <key>CFBundleInfoDictionaryVersion</key>
+  <string>6.0</string>
+  <key>CFBundleName</key>
+  <string>AppRTCDemo</string>
+  <key>CFBundlePackageType</key>
+  <string>APPL</string>
+  <key>CFBundleShortVersionString</key>
+  <string>1.0</string>
+  <key>CFBundleSignature</key>
+  <string>????</string>
+  <key>CFBundleSupportedPlatforms</key>
+  <array>
+    <string>iPhoneOS</string>
+  </array>
+  <key>CFBundleVersion</key>
+  <string>1.0</string>
+  <key>UIStatusBarTintParameters</key>
+  <dict>
+    <key>UINavigationBar</key>
+    <dict>
+      <key>Style</key>
+      <string>UIBarStyleDefault</string>
+      <key>Translucent</key>
+      <false/>
+    </dict>
+  </dict>
+  <key>UISupportedInterfaceOrientations</key>
+  <array>
+    <string>UIInterfaceOrientationPortrait</string>
+  </array>
+  <key>UIAppFonts</key>
+  <array>
+    <string>Roboto-Regular.ttf</string>
+  </array>
+  <key>UIBackgroundModes</key>
+  <array>
+    <string>voip</string>
+  </array>
+  <key>UILaunchImages</key>
+  <array>
+    <dict>
+      <key>UILaunchImageMinimumOSVersion</key>
+      <string>7.0</string>
+      <key>UILaunchImageName</key>
+      <string>iPhone5</string>
+      <key>UILaunchImageOrientation</key>
+      <string>Portrait</string>
+      <key>UILaunchImageSize</key>
+      <string>{320, 568}</string>
+    </dict>
+    <dict>
+      <key>UILaunchImageMinimumOSVersion</key>
+      <string>8.0</string>
+      <key>UILaunchImageName</key>
+      <string>iPhone6</string>
+      <key>UILaunchImageOrientation</key>
+      <string>Portrait</string>
+      <key>UILaunchImageSize</key>
+      <string>{375, 667}</string>
+    </dict>
+    <dict>
+      <key>UILaunchImageMinimumOSVersion</key>
+      <string>8.0</string>
+      <key>UILaunchImageName</key>
+      <string>iPhone6p</string>
+      <key>UILaunchImageOrientation</key>
+      <string>Portrait</string>
+      <key>UILaunchImageSize</key>
+      <string>{414, 736}</string>
+    </dict>
+  </array>
+</dict>
+</plist>
diff --git a/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.h b/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.h
new file mode 100644
index 0000000..d56ba02
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.h
@@ -0,0 +1,18 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+@interface UIImage (ARDUtilities)
+
+// Returns a color-tinted version of the given image resource.
++ (UIImage *)imageForName:(NSString *)name color:(UIColor *)color;
+
+@end
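Usage sketch (not part of the patch), mirroring the call in ARDVideoCallView.m:

    #import "UIImage+ARDUtilities.h"

    UIImage *hangupIcon = [UIImage imageForName:@"ic_call_end_black_24dp.png"
                                          color:[UIColor whiteColor]];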
diff --git a/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.m b/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.m
new file mode 100644
index 0000000..1bbe8c3
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.m
@@ -0,0 +1,31 @@
+/*
+ *  Copyright 2015 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "UIImage+ARDUtilities.h"
+
+@implementation UIImage (ARDUtilities)
+
++ (UIImage *)imageForName:(NSString *)name color:(UIColor *)color {
+  UIImage *image = [UIImage imageNamed:name];
+  if (!image) {
+    return nil;
+  }
+  UIGraphicsBeginImageContextWithOptions(image.size, NO, 0.0f);
+  [color setFill];
+  CGRect bounds = CGRectMake(0, 0, image.size.width, image.size.height);
+  UIRectFill(bounds);
+  [image drawInRect:bounds blendMode:kCGBlendModeDestinationIn alpha:1.0f];
+  UIImage *coloredImage = UIGraphicsGetImageFromCurrentImageContext();
+  UIGraphicsEndImageContext();
+
+  return coloredImage;
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/ios/main.m b/examples/objc/AppRTCDemo/ios/main.m
new file mode 100644
index 0000000..00b83f7
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/main.m
@@ -0,0 +1,20 @@
+/*
+ *  Copyright 2013 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <UIKit/UIKit.h>
+
+#import "ARDAppDelegate.h"
+
+int main(int argc, char* argv[]) {
+  @autoreleasepool {
+    return UIApplicationMain(
+        argc, argv, nil, NSStringFromClass([ARDAppDelegate class]));
+  }
+}
diff --git a/examples/objc/AppRTCDemo/ios/resources/Roboto-Regular.ttf b/examples/objc/AppRTCDemo/ios/resources/Roboto-Regular.ttf
new file mode 100644
index 0000000..0e58508
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/Roboto-Regular.ttf
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/iPhone5@2x.png b/examples/objc/AppRTCDemo/ios/resources/iPhone5@2x.png
new file mode 100644
index 0000000..9d005fd
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/iPhone5@2x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/iPhone6@2x.png b/examples/objc/AppRTCDemo/ios/resources/iPhone6@2x.png
new file mode 100644
index 0000000..fce3eb9
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/iPhone6@2x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/iPhone6p@3x.png b/examples/objc/AppRTCDemo/ios/resources/iPhone6p@3x.png
new file mode 100644
index 0000000..aee20c2
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/iPhone6p@3x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp.png b/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp.png
new file mode 100755
index 0000000..531cb0f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp@2x.png b/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp@2x.png
new file mode 100755
index 0000000..03dd381
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp@2x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp.png b/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp.png
new file mode 100755
index 0000000..4ebf8a2
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp@2x.png b/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp@2x.png
new file mode 100755
index 0000000..ed2b252
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp@2x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp.png b/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp.png
new file mode 100644
index 0000000..85271c8
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp@2x.png b/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp@2x.png
new file mode 100644
index 0000000..62b13a6
--- /dev/null
+++ b/examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp@2x.png
Binary files differ
diff --git a/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.h b/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.h
new file mode 100644
index 0000000..95f3594
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.h
@@ -0,0 +1,14 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Cocoa/Cocoa.h>
+
+@interface APPRTCAppDelegate : NSObject<NSApplicationDelegate>
+@end
diff --git a/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.m b/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.m
new file mode 100644
index 0000000..16ccfc3
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.m
@@ -0,0 +1,60 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#if !defined(__has_feature) || !__has_feature(objc_arc)
+#error "This file requires ARC support."
+#endif
+
+#import "APPRTCAppDelegate.h"
+
+#import "APPRTCViewController.h"
+#import "RTCPeerConnectionFactory.h"
+
+@interface APPRTCAppDelegate () <NSWindowDelegate>
+@end
+
+@implementation APPRTCAppDelegate {
+  APPRTCViewController* _viewController;
+  NSWindow* _window;
+}
+
+#pragma mark - NSApplicationDelegate
+
+- (void)applicationDidFinishLaunching:(NSNotification*)notification {
+  [RTCPeerConnectionFactory initializeSSL];
+  NSScreen* screen = [NSScreen mainScreen];
+  NSRect visibleRect = [screen visibleFrame];
+  NSRect windowRect = NSMakeRect(NSMidX(visibleRect),
+                                 NSMidY(visibleRect),
+                                 1320,
+                                 1140);
+  NSUInteger styleMask = NSTitledWindowMask | NSClosableWindowMask;
+  _window = [[NSWindow alloc] initWithContentRect:windowRect
+                                        styleMask:styleMask
+                                          backing:NSBackingStoreBuffered
+                                            defer:NO];
+  _window.delegate = self;
+  [_window makeKeyAndOrderFront:self];
+  [_window makeMainWindow];
+  _viewController = [[APPRTCViewController alloc] initWithNibName:nil
+                                                           bundle:nil];
+  [_window setContentView:[_viewController view]];
+}
+
+#pragma mark - NSWindow
+
+- (void)windowWillClose:(NSNotification*)notification {
+  [_viewController windowWillClose:notification];
+  [RTCPeerConnectionFactory deinitializeSSL];
+  [NSApp terminate:self];
+}
+
+@end
+
diff --git a/examples/objc/AppRTCDemo/mac/APPRTCViewController.h b/examples/objc/AppRTCDemo/mac/APPRTCViewController.h
new file mode 100644
index 0000000..b4c94a8
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/APPRTCViewController.h
@@ -0,0 +1,17 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Cocoa/Cocoa.h>
+
+@interface APPRTCViewController : NSViewController
+
+- (void)windowWillClose:(NSNotification*)notification;
+
+@end
diff --git a/examples/objc/AppRTCDemo/mac/APPRTCViewController.m b/examples/objc/AppRTCDemo/mac/APPRTCViewController.m
new file mode 100644
index 0000000..96ad7c9
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/APPRTCViewController.m
@@ -0,0 +1,306 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import "APPRTCViewController.h"
+
+#import <AVFoundation/AVFoundation.h>
+#import "ARDAppClient.h"
+#import "RTCNSGLVideoView.h"
+#import "RTCVideoTrack.h"
+
+static NSUInteger const kContentWidth = 1280;
+static NSUInteger const kContentHeight = 720;
+static NSUInteger const kRoomFieldWidth = 80;
+static NSUInteger const kLogViewHeight = 280;
+
+@class APPRTCMainView;
+@protocol APPRTCMainViewDelegate
+
+- (void)appRTCMainView:(APPRTCMainView*)mainView
+        didEnterRoomId:(NSString*)roomId;
+
+@end
+
+@interface APPRTCMainView : NSView
+
+@property(nonatomic, weak) id<APPRTCMainViewDelegate> delegate;
+@property(nonatomic, readonly) RTCNSGLVideoView* localVideoView;
+@property(nonatomic, readonly) RTCNSGLVideoView* remoteVideoView;
+
+- (void)displayLogMessage:(NSString*)message;
+
+@end
+
+@interface APPRTCMainView () <NSTextFieldDelegate, RTCNSGLVideoViewDelegate>
+@end
+@implementation APPRTCMainView  {
+  NSScrollView* _scrollView;
+  NSTextField* _roomLabel;
+  NSTextField* _roomField;
+  NSTextView* _logView;
+  RTCNSGLVideoView* _localVideoView;
+  RTCNSGLVideoView* _remoteVideoView;
+  CGSize _localVideoSize;
+  CGSize _remoteVideoSize;
+}
+
++ (BOOL)requiresConstraintBasedLayout {
+  return YES;
+}
+
+- (instancetype)initWithFrame:(NSRect)frame {
+  if (self = [super initWithFrame:frame]) {
+    [self setupViews];
+  }
+  return self;
+}
+
+- (void)updateConstraints {
+  NSParameterAssert(
+      _roomField != nil && _scrollView != nil && _remoteVideoView != nil);
+  [self removeConstraints:[self constraints]];
+  NSDictionary* viewsDictionary =
+      NSDictionaryOfVariableBindings(_roomLabel,
+                                     _roomField,
+                                     _scrollView,
+                                     _remoteVideoView);
+
+  NSSize remoteViewSize = [self remoteVideoViewSize];
+  NSDictionary* metrics = @{
+    @"kLogViewHeight" : @(kLogViewHeight),
+    @"kRoomFieldWidth" : @(kRoomFieldWidth),
+    @"remoteViewWidth" : @(remoteViewSize.width),
+    @"remoteViewHeight" : @(remoteViewSize.height),
+  };
+  // Declare this separately to avoid compiler warning about splitting string
+  // within an NSArray expression.
+  NSString* verticalConstraint =
+      @"V:|-[_roomLabel]-[_roomField]-[_scrollView(kLogViewHeight)]"
+       "-[_remoteVideoView(remoteViewHeight)]-|";
+  NSArray* constraintFormats = @[
+      verticalConstraint,
+      @"|-[_roomLabel]",
+      @"|-[_roomField(kRoomFieldWidth)]",
+      @"|-[_scrollView(remoteViewWidth)]-|",
+      @"|-[_remoteVideoView(remoteViewWidth)]-|",
+  ];
+  for (NSString* constraintFormat in constraintFormats) {
+    NSArray* constraints =
+        [NSLayoutConstraint constraintsWithVisualFormat:constraintFormat
+                                                options:0
+                                                metrics:metrics
+                                                  views:viewsDictionary];
+    for (NSLayoutConstraint* constraint in constraints) {
+      [self addConstraint:constraint];
+    }
+  }
+  [super updateConstraints];
+}
+
+- (void)displayLogMessage:(NSString*)message {
+  _logView.string =
+      [NSString stringWithFormat:@"%@%@\n", _logView.string, message];
+  NSRange range = NSMakeRange([_logView.string length], 0);
+  [_logView scrollRangeToVisible:range];
+}
+
+#pragma mark - NSControl delegate
+
+- (void)controlTextDidEndEditing:(NSNotification*)notification {
+  NSDictionary* userInfo = [notification userInfo];
+  NSInteger textMovement = [userInfo[@"NSTextMovement"] intValue];
+  if (textMovement == NSReturnTextMovement) {
+    [self.delegate appRTCMainView:self didEnterRoomId:_roomField.stringValue];
+  }
+}
+
+#pragma mark - RTCNSGLVideoViewDelegate
+
+- (void)videoView:(RTCNSGLVideoView*)videoView
+    didChangeVideoSize:(NSSize)size {
+  if (videoView == _remoteVideoView) {
+    _remoteVideoSize = size;
+  } else if (videoView == _localVideoView) {
+    _localVideoSize = size;
+  } else {
+    return;
+  }
+  [self setNeedsUpdateConstraints:YES];
+}
+
+#pragma mark - Private
+
+- (void)setupViews {
+  NSParameterAssert([[self subviews] count] == 0);
+
+  _roomLabel = [[NSTextField alloc] initWithFrame:NSZeroRect];
+  [_roomLabel setTranslatesAutoresizingMaskIntoConstraints:NO];
+  [_roomLabel setBezeled:NO];
+  [_roomLabel setDrawsBackground:NO];
+  [_roomLabel setEditable:NO];
+  [_roomLabel setStringValue:@"Enter AppRTC room id:"];
+  [self addSubview:_roomLabel];
+
+  _roomField = [[NSTextField alloc] initWithFrame:NSZeroRect];
+  [_roomField setTranslatesAutoresizingMaskIntoConstraints:NO];
+  [self addSubview:_roomField];
+  [_roomField setEditable:YES];
+  [_roomField setDelegate:self];
+
+  _logView = [[NSTextView alloc] initWithFrame:NSZeroRect];
+  [_logView setMinSize:NSMakeSize(0, kLogViewHeight)];
+  [_logView setMaxSize:NSMakeSize(FLT_MAX, FLT_MAX)];
+  [_logView setVerticallyResizable:YES];
+  [_logView setAutoresizingMask:NSViewWidthSizable];
+  NSTextContainer* textContainer = [_logView textContainer];
+  NSSize containerSize = NSMakeSize(kContentWidth, FLT_MAX);
+  [textContainer setContainerSize:containerSize];
+  [textContainer setWidthTracksTextView:YES];
+  [_logView setEditable:NO];
+
+  _scrollView = [[NSScrollView alloc] initWithFrame:NSZeroRect];
+  [_scrollView setTranslatesAutoresizingMaskIntoConstraints:NO];
+  [_scrollView setHasVerticalScroller:YES];
+  [_scrollView setDocumentView:_logView];
+  [self addSubview:_scrollView];
+
+  NSOpenGLPixelFormatAttribute attributes[] = {
+    NSOpenGLPFADoubleBuffer,
+    NSOpenGLPFADepthSize, 24,
+    NSOpenGLPFAOpenGLProfile,
+    NSOpenGLProfileVersion3_2Core,
+    0
+  };
+  NSOpenGLPixelFormat* pixelFormat =
+      [[NSOpenGLPixelFormat alloc] initWithAttributes:attributes];
+  _remoteVideoView = [[RTCNSGLVideoView alloc] initWithFrame:NSZeroRect
+                                                 pixelFormat:pixelFormat];
+  [_remoteVideoView setTranslatesAutoresizingMaskIntoConstraints:NO];
+  _remoteVideoView.delegate = self;
+  [self addSubview:_remoteVideoView];
+
+  // TODO(tkchin): create local video view.
+  // https://code.google.com/p/webrtc/issues/detail?id=3417.
+}
+
+- (NSSize)remoteVideoViewSize {
+  if (_remoteVideoSize.width > 0 && _remoteVideoSize.height > 0) {
+    return _remoteVideoSize;
+  } else {
+    return NSMakeSize(kContentWidth, kContentHeight);
+  }
+}
+
+- (NSSize)localVideoViewSize {
+  return NSZeroSize;
+}
+
+@end
+
+@interface APPRTCViewController ()
+    <ARDAppClientDelegate, APPRTCMainViewDelegate>
+@property(nonatomic, readonly) APPRTCMainView* mainView;
+@end
+
+@implementation APPRTCViewController {
+  ARDAppClient* _client;
+  RTCVideoTrack* _localVideoTrack;
+  RTCVideoTrack* _remoteVideoTrack;
+}
+
+- (void)dealloc {
+  [self disconnect];
+}
+
+- (void)loadView {
+  APPRTCMainView* view = [[APPRTCMainView alloc] initWithFrame:NSZeroRect];
+  [view setTranslatesAutoresizingMaskIntoConstraints:NO];
+  view.delegate = self;
+  self.view = view;
+}
+
+- (void)windowWillClose:(NSNotification*)notification {
+  [self disconnect];
+}
+
+#pragma mark - ARDAppClientDelegate
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeState:(ARDAppClientState)state {
+  switch (state) {
+    case kARDAppClientStateConnected:
+      NSLog(@"Client connected.");
+      break;
+    case kARDAppClientStateConnecting:
+      NSLog(@"Client connecting.");
+      break;
+    case kARDAppClientStateDisconnected:
+      NSLog(@"Client disconnected.");
+      [self resetUI];
+      _client = nil;
+      break;
+  }
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didChangeConnectionState:(RTCICEConnectionState)state {
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
+  _localVideoTrack = localVideoTrack;
+}
+
+- (void)appClient:(ARDAppClient *)client
+    didReceiveRemoteVideoTrack:(RTCVideoTrack *)remoteVideoTrack {
+  _remoteVideoTrack = remoteVideoTrack;
+  [_remoteVideoTrack addRenderer:self.mainView.remoteVideoView];
+}
+
+- (void)appClient:(ARDAppClient *)client
+         didError:(NSError *)error {
+  [self showAlertWithMessage:[NSString stringWithFormat:@"%@", error]];
+  [self disconnect];
+}
+
+#pragma mark - APPRTCMainViewDelegate
+
+- (void)appRTCMainView:(APPRTCMainView*)mainView
+        didEnterRoomId:(NSString*)roomId {
+  [_client disconnect];
+  ARDAppClient *client = [[ARDAppClient alloc] initWithDelegate:self];
+  [client connectToRoomWithId:roomId options:nil];
+  _client = client;
+}
+
+#pragma mark - Private
+
+- (APPRTCMainView*)mainView {
+  return (APPRTCMainView*)self.view;
+}
+
+- (void)showAlertWithMessage:(NSString*)message {
+  NSAlert* alert = [[NSAlert alloc] init];
+  [alert setMessageText:message];
+  [alert runModal];
+}
+
+- (void)resetUI {
+  [_remoteVideoTrack removeRenderer:self.mainView.remoteVideoView];
+  _remoteVideoTrack = nil;
+  [self.mainView.remoteVideoView renderFrame:nil];
+}
+
+- (void)disconnect {
+  [self resetUI];
+  [_client disconnect];
+}
+
+@end
diff --git a/examples/objc/AppRTCDemo/mac/Info.plist b/examples/objc/AppRTCDemo/mac/Info.plist
new file mode 100644
index 0000000..4dcb240
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/Info.plist
@@ -0,0 +1,29 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+  <key>CFBundleDevelopmentRegion</key>
+  <string>en</string>
+  <key>CFBundleDisplayName</key>
+  <string>${PRODUCT_NAME}</string>
+  <key>CFBundleExecutable</key>
+  <string>${EXECUTABLE_NAME}</string>
+  <key>CFBundleIdentifier</key>
+  <string>com.Google.${PRODUCT_NAME:rfc1034identifier}</string>
+  <key>CFBundleInfoDictionaryVersion</key>
+  <string>6.0</string>
+  <key>CFBundleName</key>
+  <string>${PRODUCT_NAME}</string>
+  <key>CFBundlePackageType</key>
+  <string>APPL</string>
+  <key>CFBundleShortVersionString</key>
+  <string>1.0</string>
+  <key>CFBundleVersion</key>
+  <string>1.0</string>
+  <key>LSMinimumSystemVersion</key>
+  <string>${MACOSX_DEPLOYMENT_TARGET}</string>
+  <key>NSPrincipalClass</key>
+  <string>NSApplication</string>
+</dict>
+</plist>
\ No newline at end of file
diff --git a/examples/objc/AppRTCDemo/mac/main.m b/examples/objc/AppRTCDemo/mac/main.m
new file mode 100644
index 0000000..23153e6
--- /dev/null
+++ b/examples/objc/AppRTCDemo/mac/main.m
@@ -0,0 +1,22 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Cocoa/Cocoa.h>
+
+#import "APPRTCAppDelegate.h"
+
+int main(int argc, char* argv[]) {
+  @autoreleasepool {
+    [NSApplication sharedApplication];
+    APPRTCAppDelegate* delegate = [[APPRTCAppDelegate alloc] init];
+    [NSApp setDelegate:delegate];
+    [NSApp run];
+  }
+}
diff --git a/examples/objc/AppRTCDemo/tests/ARDAppClientTest.mm b/examples/objc/AppRTCDemo/tests/ARDAppClientTest.mm
new file mode 100644
index 0000000..b131931
--- /dev/null
+++ b/examples/objc/AppRTCDemo/tests/ARDAppClientTest.mm
@@ -0,0 +1,337 @@
+/*
+ *  Copyright 2014 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#import <Foundation/Foundation.h>
+#import <OCMock/OCMock.h>
+
+#import "ARDAppClient+Internal.h"
+#import "ARDJoinResponse+Internal.h"
+#import "ARDMessageResponse+Internal.h"
+#import "ARDSDPUtils.h"
+#import "RTCMediaConstraints.h"
+#import "RTCPeerConnectionFactory.h"
+#import "RTCSessionDescription.h"
+
+#include "webrtc/base/gunit.h"
+#include "webrtc/base/ssladapter.h"
+
+// These classes mimic XCTest APIs, to make eventual conversion to XCTest
+// easier. Conversion will happen once XCTest is supported well on build bots.
+@interface ARDTestExpectation : NSObject
+
+@property(nonatomic, readonly) NSString *description;
+@property(nonatomic, readonly) BOOL isFulfilled;
+
+- (instancetype)initWithDescription:(NSString *)description;
+- (void)fulfill;
+
+@end
+
+@implementation ARDTestExpectation
+
+@synthesize description = _description;
+@synthesize isFulfilled = _isFulfilled;
+
+- (instancetype)initWithDescription:(NSString *)description {
+  if (self = [super init]) {
+    _description = description;
+  }
+  return self;
+}
+
+- (void)fulfill {
+  _isFulfilled = YES;
+}
+
+@end
+
+@interface ARDTestCase : NSObject
+
+- (ARDTestExpectation *)expectationWithDescription:(NSString *)description;
+- (void)waitForExpectationsWithTimeout:(NSTimeInterval)timeout
+                               handler:(void (^)(NSError *error))handler;
+
+@end
+
+@implementation ARDTestCase {
+  NSMutableArray *_expectations;
+}
+
+- (instancetype)init {
+  if (self = [super init]) {
+   _expectations = [NSMutableArray array];
+  }
+  return self;
+}
+
+- (ARDTestExpectation *)expectationWithDescription:(NSString *)description {
+  ARDTestExpectation *expectation =
+      [[ARDTestExpectation alloc] initWithDescription:description];
+  [_expectations addObject:expectation];
+  return expectation;
+}
+
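+// Spins the current run loop in one-second slices until every registered
+// expectation is fulfilled. If the timeout elapses first the NSAssert below
+// fires; either way the handler is invoked afterwards with a nil error, since
+// this minimal stand-in never constructs an NSError.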
+- (void)waitForExpectationsWithTimeout:(NSTimeInterval)timeout
+                               handler:(void (^)(NSError *error))handler {
+  NSDate *startDate = [NSDate date];
+  while (![self areExpectationsFulfilled]) {
+    NSTimeInterval duration = [[NSDate date] timeIntervalSinceDate:startDate];
+    if (duration > timeout) {
+      NSAssert(NO, @"Expectation timed out.");
+      break;
+    }
+    [[NSRunLoop currentRunLoop]
+        runUntilDate:[NSDate dateWithTimeIntervalSinceNow:1]];
+  }
+  handler(nil);
+}
+
+- (BOOL)areExpectationsFulfilled {
+  for (ARDTestExpectation *expectation in _expectations) {
+    if (!expectation.isFulfilled) {
+      return NO;
+    }
+  }
+  return YES;
+}
+
+@end
+
+@interface ARDAppClientTest : ARDTestCase
+@end
+
+@implementation ARDAppClientTest
+
+#pragma mark - Mock helpers
+
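+// Note on the NSInvocation argument indices used in these helpers: index 0 is
+// self and index 1 is _cmd, so the first explicit Objective-C argument lives
+// at index 2. That is why the join completion block is read from index 3 and
+// the sendMessage completion block from index 5.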
+- (id)mockRoomServerClientForRoomId:(NSString *)roomId
+                           clientId:(NSString *)clientId
+                        isInitiator:(BOOL)isInitiator
+                           messages:(NSArray *)messages
+                     messageHandler:
+    (void (^)(ARDSignalingMessage *))messageHandler {
+  id mockRoomServerClient =
+      [OCMockObject mockForProtocol:@protocol(ARDRoomServerClient)];
+
+  // Successful join response.
+  ARDJoinResponse *joinResponse = [[ARDJoinResponse alloc] init];
+  joinResponse.result = kARDJoinResultTypeSuccess;
+  joinResponse.roomId = roomId;
+  joinResponse.clientId = clientId;
+  joinResponse.isInitiator = isInitiator;
+  joinResponse.messages = messages;
+
+  // Successful message response.
+  ARDMessageResponse *messageResponse = [[ARDMessageResponse alloc] init];
+  messageResponse.result = kARDMessageResultTypeSuccess;
+
+  // Return join response from above on join.
+  [[[mockRoomServerClient stub] andDo:^(NSInvocation *invocation) {
+    __unsafe_unretained void (^completionHandler)(ARDJoinResponse *response,
+                                                  NSError *error);
+    [invocation getArgument:&completionHandler atIndex:3];
+    completionHandler(joinResponse, nil);
+  }] joinRoomWithRoomId:roomId completionHandler:[OCMArg any]];
+
+  // Return message response from above on join.
+  [[[mockRoomServerClient stub] andDo:^(NSInvocation *invocation) {
+    __unsafe_unretained ARDSignalingMessage *message;
+    __unsafe_unretained void (^completionHandler)(ARDMessageResponse *response,
+                                                  NSError *error);
+    [invocation getArgument:&message atIndex:2];
+    [invocation getArgument:&completionHandler atIndex:5];
+    messageHandler(message);
+    completionHandler(messageResponse, nil);
+  }] sendMessage:[OCMArg any]
+            forRoomId:roomId
+             clientId:clientId
+    completionHandler:[OCMArg any]];
+
+  // Do nothing on leave.
+  [[[mockRoomServerClient stub] andDo:^(NSInvocation *invocation) {
+    __unsafe_unretained void (^completionHandler)(NSError *error);
+    [invocation getArgument:&completionHandler atIndex:4];
+    if (completionHandler) {
+      completionHandler(nil);
+    }
+  }] leaveRoomWithRoomId:roomId
+                clientId:clientId
+       completionHandler:[OCMArg any]];
+
+  return mockRoomServerClient;
+}
+
+- (id)mockSignalingChannelForRoomId:(NSString *)roomId
+                           clientId:(NSString *)clientId
+                     messageHandler:
+    (void (^)(ARDSignalingMessage *message))messageHandler {
+  id mockSignalingChannel =
+      [OCMockObject niceMockForProtocol:@protocol(ARDSignalingChannel)];
+  [[mockSignalingChannel stub] registerForRoomId:roomId clientId:clientId];
+  [[[mockSignalingChannel stub] andDo:^(NSInvocation *invocation) {
+    __unsafe_unretained ARDSignalingMessage *message;
+    [invocation getArgument:&message atIndex:2];
+    messageHandler(message);
+  }] sendMessage:[OCMArg any]];
+  return mockSignalingChannel;
+}
+
+- (id)mockTURNClient {
+  id mockTURNClient =
+      [OCMockObject mockForProtocol:@protocol(ARDTURNClient)];
+  [[[mockTURNClient stub] andDo:^(NSInvocation *invocation) {
+    // Don't return anything in TURN response.
+    __unsafe_unretained void (^completionHandler)(NSArray *turnServers,
+                                                  NSError *error);
+    [invocation getArgument:&completionHandler atIndex:2];
+    completionHandler([NSArray array], nil);
+  }] requestServersWithCompletionHandler:[OCMArg any]];
+  return mockTURNClient;
+}
+
+- (ARDAppClient *)createAppClientForRoomId:(NSString *)roomId
+                                  clientId:(NSString *)clientId
+                               isInitiator:(BOOL)isInitiator
+                                  messages:(NSArray *)messages
+                            messageHandler:
+    (void (^)(ARDSignalingMessage *message))messageHandler
+                          connectedHandler:(void (^)(void))connectedHandler {
+  id turnClient = [self mockTURNClient];
+  id signalingChannel = [self mockSignalingChannelForRoomId:roomId
+                                                   clientId:clientId
+                                             messageHandler:messageHandler];
+  id roomServerClient =
+      [self mockRoomServerClientForRoomId:roomId
+                                 clientId:clientId
+                              isInitiator:isInitiator
+                                 messages:messages
+                           messageHandler:messageHandler];
+  id delegate =
+      [OCMockObject niceMockForProtocol:@protocol(ARDAppClientDelegate)];
+  [[[delegate stub] andDo:^(NSInvocation *invocation) {
+    connectedHandler();
+  }] appClient:[OCMArg any] didChangeConnectionState:RTCICEConnectionConnected];
+
+  return [[ARDAppClient alloc] initWithRoomServerClient:roomServerClient
+                                       signalingChannel:signalingChannel
+                                             turnClient:turnClient
+                                               delegate:delegate];
+}
+
+// Tests that an ICE connection is established between two ARDAppClient objects
+// where one is set up as a caller and the other the answerer. Network
+// components are mocked out and messages are relayed directly from object to
+// object. It's expected that both clients reach the RTCICEConnectionConnected
+// state within a reasonable amount of time.
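+// Each client's outgoing signaling messages are fed directly into the other
+// client via channel:didReceiveMessage:, which is why the message handlers
+// below resolve the weak pointer to the peer before forwarding.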
+- (void)testSession {
+  // Need block arguments here because we're setting up callbacks before we
+  // create the clients.
+  ARDAppClient *caller = nil;
+  ARDAppClient *answerer = nil;
+  __block __weak ARDAppClient *weakCaller = nil;
+  __block __weak ARDAppClient *weakAnswerer = nil;
+  NSString *roomId = @"testRoom";
+  NSString *callerId = @"testCallerId";
+  NSString *answererId = @"testAnswererId";
+
+  ARDTestExpectation *callerConnectionExpectation =
+      [self expectationWithDescription:@"Caller PC connected."];
+  ARDTestExpectation *answererConnectionExpectation =
+      [self expectationWithDescription:@"Answerer PC connected."];
+
+  caller = [self createAppClientForRoomId:roomId
+                                 clientId:callerId
+                              isInitiator:YES
+                                 messages:[NSArray array]
+                           messageHandler:^(ARDSignalingMessage *message) {
+    ARDAppClient *strongAnswerer = weakAnswerer;
+    [strongAnswerer channel:strongAnswerer.channel didReceiveMessage:message];
+  } connectedHandler:^{
+    [callerConnectionExpectation fulfill];
+  }];
+  // TODO(tkchin): Figure out why DTLS-SRTP constraint causes thread assertion
+  // crash in Debug.
+  caller.defaultPeerConnectionConstraints = [[RTCMediaConstraints alloc] init];
+  weakCaller = caller;
+
+  answerer = [self createAppClientForRoomId:roomId
+                                   clientId:answererId
+                                isInitiator:NO
+                                   messages:[NSArray array]
+                             messageHandler:^(ARDSignalingMessage *message) {
+    ARDAppClient *strongCaller = weakCaller;
+    [strongCaller channel:strongCaller.channel didReceiveMessage:message];
+  } connectedHandler:^{
+    [answererConnectionExpectation fulfill];
+  }];
+  // TODO(tkchin): Figure out why DTLS-SRTP constraint causes thread assertion
+  // crash in Debug.
+  answerer.defaultPeerConnectionConstraints =
+      [[RTCMediaConstraints alloc] init];
+  weakAnswerer = answerer;
+
+  // Kick off connection.
+  [caller connectToRoomWithId:roomId options:nil];
+  [answerer connectToRoomWithId:roomId options:nil];
+  [self waitForExpectationsWithTimeout:20 handler:^(NSError *error) {
+    if (error) {
+      NSLog(@"Expectations error: %@", error);
+    }
+  }];
+}
+
+@end
+
+@interface ARDSDPUtilsTest : ARDTestCase
+- (void)testPreferVideoCodec;
+@end
+
+@implementation ARDSDPUtilsTest
+
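+// Preferring H264 should move its payload type (120, per the a=rtpmap line)
+// to the front of the m=video payload list; the expected SDP below encodes
+// exactly that reordering.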
+- (void)testPreferVideoCodec {
+  NSString *sdp = @("m=video 9 RTP/SAVPF 100 116 117 96 120\n"
+                    "a=rtpmap:120 H264/90000\n");
+  NSString *expectedSdp = @("m=video 9 RTP/SAVPF 120 100 116 117 96\n"
+                            "a=rtpmap:120 H264/90000\n");
+  RTCSessionDescription* desc =
+      [[RTCSessionDescription alloc] initWithType:@"offer" sdp:sdp];
+  RTCSessionDescription *h264Desc =
+      [ARDSDPUtils descriptionForDescription:desc
+                         preferredVideoCodec:@"H264"];
+  EXPECT_TRUE([h264Desc.description isEqualToString:expectedSdp]);
+}
+
+@end
+
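+// The Objective-C test objects above are driven from gtest so they run as part
+// of the existing C++ test binary; SSL is initialized and torn down once for
+// the SignalingTest fixture via SetUpTestCase/TearDownTestCase.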
+class SignalingTest : public ::testing::Test {
+ protected:
+  static void SetUpTestCase() {
+    rtc::InitializeSSL();
+  }
+  static void TearDownTestCase() {
+    rtc::CleanupSSL();
+  }
+};
+
+TEST_F(SignalingTest, SessionTest) {
+  @autoreleasepool {
+    ARDAppClientTest *test = [[ARDAppClientTest alloc] init];
+    [test testSession];
+  }
+}
+
+TEST_F(SignalingTest, SDPTest) {
+  @autoreleasepool {
+    ARDSDPUtilsTest *test = [[ARDSDPUtilsTest alloc] init];
+    [test testPreferVideoCodec];
+  }
+}
+
+
diff --git a/examples/objc/AppRTCDemo/third_party/SocketRocket/LICENSE b/examples/objc/AppRTCDemo/third_party/SocketRocket/LICENSE
new file mode 100644
index 0000000..c01a79c
--- /dev/null
+++ b/examples/objc/AppRTCDemo/third_party/SocketRocket/LICENSE
@@ -0,0 +1,15 @@
+
+   Copyright 2012 Square Inc.
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
diff --git a/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.h b/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.h
new file mode 100644
index 0000000..5cce725
--- /dev/null
+++ b/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.h
@@ -0,0 +1,132 @@
+//
+//   Copyright 2012 Square Inc.
+//
+//   Licensed under the Apache License, Version 2.0 (the "License");
+//   you may not use this file except in compliance with the License.
+//   You may obtain a copy of the License at
+//
+//       http://www.apache.org/licenses/LICENSE-2.0
+//
+//   Unless required by applicable law or agreed to in writing, software
+//   distributed under the License is distributed on an "AS IS" BASIS,
+//   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+//   See the License for the specific language governing permissions and
+//   limitations under the License.
+//
+
+#import <Foundation/Foundation.h>
+#import <Security/SecCertificate.h>
+
+typedef enum {
+    SR_CONNECTING   = 0,
+    SR_OPEN         = 1,
+    SR_CLOSING      = 2,
+    SR_CLOSED       = 3,
+} SRReadyState;
+
+typedef enum SRStatusCode : NSInteger {
+    SRStatusCodeNormal = 1000,
+    SRStatusCodeGoingAway = 1001,
+    SRStatusCodeProtocolError = 1002,
+    SRStatusCodeUnhandledType = 1003,
+    // 1004 reserved.
+    SRStatusNoStatusReceived = 1005,
+    // 1004-1006 reserved.
+    SRStatusCodeInvalidUTF8 = 1007,
+    SRStatusCodePolicyViolated = 1008,
+    SRStatusCodeMessageTooBig = 1009,
+} SRStatusCode;
+
+@class SRWebSocket;
+
+extern NSString *const SRWebSocketErrorDomain;
+extern NSString *const SRHTTPResponseErrorKey;
+
+#pragma mark - SRWebSocketDelegate
+
+@protocol SRWebSocketDelegate;
+
+#pragma mark - SRWebSocket
+
+@interface SRWebSocket : NSObject <NSStreamDelegate>
+
+@property (nonatomic, weak) id <SRWebSocketDelegate> delegate;
+
+@property (nonatomic, readonly) SRReadyState readyState;
+@property (nonatomic, readonly, retain) NSURL *url;
+
+// This returns the negotiated protocol.
+// It will be nil until after the handshake completes.
+@property (nonatomic, readonly, copy) NSString *protocol;
+
+// Protocols should be an array of strings that turn into Sec-WebSocket-Protocol.
+- (id)initWithURLRequest:(NSURLRequest *)request protocols:(NSArray *)protocols;
+- (id)initWithURLRequest:(NSURLRequest *)request;
+
+// Some helper constructors.
+- (id)initWithURL:(NSURL *)url protocols:(NSArray *)protocols;
+- (id)initWithURL:(NSURL *)url;
+
+// Delegate queue will be dispatch_main_queue by default.
+// You cannot set both OperationQueue and dispatch_queue.
+- (void)setDelegateOperationQueue:(NSOperationQueue*) queue;
+- (void)setDelegateDispatchQueue:(dispatch_queue_t) queue;
+
+// By default, it will schedule itself on +[NSRunLoop SR_networkRunLoop] using defaultModes.
+- (void)scheduleInRunLoop:(NSRunLoop *)aRunLoop forMode:(NSString *)mode;
+- (void)unscheduleFromRunLoop:(NSRunLoop *)aRunLoop forMode:(NSString *)mode;
+
+// SRWebSockets are intended for one-time-use only.  Open should be called once and only once.
+- (void)open;
+
+- (void)close;
+- (void)closeWithCode:(NSInteger)code reason:(NSString *)reason;
+
+// Send a UTF8 String or Data.
+- (void)send:(id)data;
+
+// Send Data (can be nil) in a ping message.
+- (void)sendPing:(NSData *)data;
+
+@end
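+
+// A minimal usage sketch (illustrative only; the URL is a placeholder):
+//
+//   SRWebSocket *socket =
+//       [[SRWebSocket alloc] initWithURL:[NSURL URLWithString:@"wss://example.com/socket"]];
+//   socket.delegate = self;      // conform to SRWebSocketDelegate below
+//   [socket open];               // open exactly once per instance
+//   // ...once -webSocketDidOpen: fires:
+//   [socket send:@"hello"];      // NSString for text frames, NSData for binary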
+
+#pragma mark - SRWebSocketDelegate
+
+@protocol SRWebSocketDelegate <NSObject>
+
+// message will either be an NSString if the server is using text
+// or NSData if the server is using binary.
+- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message;
+
+@optional
+
+- (void)webSocketDidOpen:(SRWebSocket *)webSocket;
+- (void)webSocket:(SRWebSocket *)webSocket didFailWithError:(NSError *)error;
+- (void)webSocket:(SRWebSocket *)webSocket didCloseWithCode:(NSInteger)code reason:(NSString *)reason wasClean:(BOOL)wasClean;
+- (void)webSocket:(SRWebSocket *)webSocket didReceivePong:(NSData *)pongPayload;
+
+@end
+
+#pragma mark - NSURLRequest (CertificateAdditions)
+
+@interface NSURLRequest (CertificateAdditions)
+
+@property (nonatomic, retain, readonly) NSArray *SR_SSLPinnedCertificates;
+
+@end
+
+#pragma mark - NSMutableURLRequest (CertificateAdditions)
+
+@interface NSMutableURLRequest (CertificateAdditions)
+
+@property (nonatomic, retain) NSArray *SR_SSLPinnedCertificates;
+
+@end
+
+#pragma mark - NSRunLoop (SRWebSocket)
+
+@interface NSRunLoop (SRWebSocket)
+
++ (NSRunLoop *)SR_networkRunLoop;
+
+@end
diff --git a/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.m b/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.m
new file mode 100644
index 0000000..b8add7f
--- /dev/null
+++ b/examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.m
@@ -0,0 +1,1761 @@
+//
+//   Copyright 2012 Square Inc.
+//
+//   Licensed under the Apache License, Version 2.0 (the "License");
+//   you may not use this file except in compliance with the License.
+//   You may obtain a copy of the License at
+//
+//       http://www.apache.org/licenses/LICENSE-2.0
+//
+//   Unless required by applicable law or agreed to in writing, software
+//   distributed under the License is distributed on an "AS IS" BASIS,
+//   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+//   See the License for the specific language governing permissions and
+//   limitations under the License.
+//
+
+
+#import "SRWebSocket.h"
+
+#if TARGET_OS_IPHONE
+#define HAS_ICU
+#endif
+
+#ifdef HAS_ICU
+#import <unicode/utf8.h>
+#endif
+
+#if TARGET_OS_IPHONE
+#import <Endian.h>
+#else
+#import <CoreServices/CoreServices.h>
+#endif
+
+#import <CommonCrypto/CommonDigest.h>
+#import <Security/SecRandom.h>
+
+#if OS_OBJECT_USE_OBJC_RETAIN_RELEASE
+#define sr_dispatch_retain(x)
+#define sr_dispatch_release(x)
+#define maybe_bridge(x) ((__bridge void *) x)
+#else
+#define sr_dispatch_retain(x) dispatch_retain(x)
+#define sr_dispatch_release(x) dispatch_release(x)
+#define maybe_bridge(x) (x)
+#endif
+
+#if !__has_feature(objc_arc) 
+#error SocketRocket must be compiled with ARC enabled
+#endif
+
+
+typedef enum  {
+    SROpCodeTextFrame = 0x1,
+    SROpCodeBinaryFrame = 0x2,
+    // 3-7 reserved.
+    SROpCodeConnectionClose = 0x8,
+    SROpCodePing = 0x9,
+    SROpCodePong = 0xA,
+    // B-F reserved.
+} SROpCode;
+
+typedef struct {
+    BOOL fin;
+//  BOOL rsv1;
+//  BOOL rsv2;
+//  BOOL rsv3;
+    uint8_t opcode;
+    BOOL masked;
+    uint64_t payload_length;
+} frame_header;
+
+static NSString *const SRWebSocketAppendToSecKeyString = @"258EAFA5-E914-47DA-95CA-C5AB0DC85B11";
+
+static inline int32_t validate_dispatch_data_partial_string(NSData *data);
+static inline void SRFastLog(NSString *format, ...);
+
+@interface NSData (SRWebSocket)
+
+- (NSString *)stringBySHA1ThenBase64Encoding;
+
+@end
+
+
+@interface NSString (SRWebSocket)
+
+- (NSString *)stringBySHA1ThenBase64Encoding;
+
+@end
+
+
+@interface NSURL (SRWebSocket)
+
+// The origin isn't really applicable for a native application.
+// So instead, just map ws -> http and wss -> https.
+- (NSString *)SR_origin;
+
+@end
+
+
+@interface _SRRunLoopThread : NSThread
+
+@property (nonatomic, readonly) NSRunLoop *runLoop;
+
+@end
+
+
+static NSString *newSHA1String(const char *bytes, size_t length) {
+    uint8_t md[CC_SHA1_DIGEST_LENGTH];
+
+    assert(length >= 0);
+    assert(length <= UINT32_MAX);
+    CC_SHA1(bytes, (CC_LONG)length, md);
+    
+    NSData *data = [NSData dataWithBytes:md length:CC_SHA1_DIGEST_LENGTH];
+    
+    if ([data respondsToSelector:@selector(base64EncodedStringWithOptions:)]) {
+        return [data base64EncodedStringWithOptions:0];
+    }
+    
+    return [data base64Encoding];
+}
+
+@implementation NSData (SRWebSocket)
+
+- (NSString *)stringBySHA1ThenBase64Encoding;
+{
+    return newSHA1String(self.bytes, self.length);
+}
+
+@end
+
+
+@implementation NSString (SRWebSocket)
+
+- (NSString *)stringBySHA1ThenBase64Encoding;
+{
+    return newSHA1String(self.UTF8String, self.length);
+}
+
+@end
+
+NSString *const SRWebSocketErrorDomain = @"SRWebSocketErrorDomain";
+NSString *const SRHTTPResponseErrorKey = @"HTTPResponseStatusCode";
+
+// Returns number of bytes consumed. Returning 0 means you didn't match.
+// Sends bytes to the callback handler.
+typedef size_t (^stream_scanner)(NSData *collected_data);
+
+typedef void (^data_callback)(SRWebSocket *webSocket,  NSData *data);
+
+@interface SRIOConsumer : NSObject {
+    stream_scanner _scanner;
+    data_callback _handler;
+    size_t _bytesNeeded;
+    BOOL _readToCurrentFrame;
+    BOOL _unmaskBytes;
+}
+@property (nonatomic, copy, readonly) stream_scanner consumer;
+@property (nonatomic, copy, readonly) data_callback handler;
+@property (nonatomic, assign) size_t bytesNeeded;
+@property (nonatomic, assign, readonly) BOOL readToCurrentFrame;
+@property (nonatomic, assign, readonly) BOOL unmaskBytes;
+
+@end
+
+// This class is not thread-safe, and is expected to always be run on the same queue.
+@interface SRIOConsumerPool : NSObject
+
+- (id)initWithBufferCapacity:(NSUInteger)poolSize;
+
+- (SRIOConsumer *)consumerWithScanner:(stream_scanner)scanner handler:(data_callback)handler bytesNeeded:(size_t)bytesNeeded readToCurrentFrame:(BOOL)readToCurrentFrame unmaskBytes:(BOOL)unmaskBytes;
+- (void)returnConsumer:(SRIOConsumer *)consumer;
+
+@end
+
+@interface SRWebSocket ()  <NSStreamDelegate>
+
+- (void)_writeData:(NSData *)data;
+- (void)_closeWithProtocolError:(NSString *)message;
+- (void)_failWithError:(NSError *)error;
+
+- (void)_disconnect;
+
+- (void)_readFrameNew;
+- (void)_readFrameContinue;
+
+- (void)_pumpScanner;
+
+- (void)_pumpWriting;
+
+- (void)_addConsumerWithScanner:(stream_scanner)consumer callback:(data_callback)callback;
+- (void)_addConsumerWithDataLength:(size_t)dataLength callback:(data_callback)callback readToCurrentFrame:(BOOL)readToCurrentFrame unmaskBytes:(BOOL)unmaskBytes;
+- (void)_addConsumerWithScanner:(stream_scanner)consumer callback:(data_callback)callback dataLength:(size_t)dataLength;
+- (void)_readUntilBytes:(const void *)bytes length:(size_t)length callback:(data_callback)dataHandler;
+- (void)_readUntilHeaderCompleteWithCallback:(data_callback)dataHandler;
+
+- (void)_sendFrameWithOpcode:(SROpCode)opcode data:(id)data;
+
+- (BOOL)_checkHandshake:(CFHTTPMessageRef)httpMessage;
+- (void)_SR_commonInit;
+
+- (void)_initializeStreams;
+- (void)_connect;
+
+@property (nonatomic) SRReadyState readyState;
+
+@property (nonatomic) NSOperationQueue *delegateOperationQueue;
+@property (nonatomic) dispatch_queue_t delegateDispatchQueue;
+
+@end
+
+
+@implementation SRWebSocket {
+    NSInteger _webSocketVersion;
+    
+    NSOperationQueue *_delegateOperationQueue;
+    dispatch_queue_t _delegateDispatchQueue;
+    
+    dispatch_queue_t _workQueue;
+    NSMutableArray *_consumers;
+
+    NSInputStream *_inputStream;
+    NSOutputStream *_outputStream;
+   
+    NSMutableData *_readBuffer;
+    NSUInteger _readBufferOffset;
+ 
+    NSMutableData *_outputBuffer;
+    NSUInteger _outputBufferOffset;
+
+    uint8_t _currentFrameOpcode;
+    size_t _currentFrameCount;
+    size_t _readOpCount;
+    uint32_t _currentStringScanPosition;
+    NSMutableData *_currentFrameData;
+    
+    NSString *_closeReason;
+    
+    NSString *_secKey;
+    
+    BOOL _pinnedCertFound;
+    
+    uint8_t _currentReadMaskKey[4];
+    size_t _currentReadMaskOffset;
+
+    BOOL _consumerStopped;
+    
+    BOOL _closeWhenFinishedWriting;
+    BOOL _failed;
+
+    BOOL _secure;
+    NSURLRequest *_urlRequest;
+
+    CFHTTPMessageRef _receivedHTTPHeaders;
+    
+    BOOL _sentClose;
+    BOOL _didFail;
+    int _closeCode;
+    
+    BOOL _isPumping;
+    
+    NSMutableSet *_scheduledRunloops;
+    
+    // We use this to retain ourselves.
+    __strong SRWebSocket *_selfRetain;
+    
+    NSArray *_requestedProtocols;
+    SRIOConsumerPool *_consumerPool;
+}
+
+@synthesize delegate = _delegate;
+@synthesize url = _url;
+@synthesize readyState = _readyState;
+@synthesize protocol = _protocol;
+
+static __strong NSData *CRLFCRLF;
+
++ (void)initialize;
+{
+    CRLFCRLF = [[NSData alloc] initWithBytes:"\r\n\r\n" length:4];
+}
+
+- (id)initWithURLRequest:(NSURLRequest *)request protocols:(NSArray *)protocols;
+{
+    self = [super init];
+    if (self) {
+        assert(request.URL);
+        _url = request.URL;
+        _urlRequest = request;
+        
+        _requestedProtocols = [protocols copy];
+        
+        [self _SR_commonInit];
+    }
+    
+    return self;
+}
+
+- (id)initWithURLRequest:(NSURLRequest *)request;
+{
+    return [self initWithURLRequest:request protocols:nil];
+}
+
+- (id)initWithURL:(NSURL *)url;
+{
+    return [self initWithURL:url protocols:nil];
+}
+
+- (id)initWithURL:(NSURL *)url protocols:(NSArray *)protocols;
+{
+    NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url];    
+    return [self initWithURLRequest:request protocols:protocols];
+}
+
+- (void)_SR_commonInit;
+{
+    
+    NSString *scheme = _url.scheme.lowercaseString;
+    assert([scheme isEqualToString:@"ws"] || [scheme isEqualToString:@"http"] || [scheme isEqualToString:@"wss"] || [scheme isEqualToString:@"https"]);
+    
+    if ([scheme isEqualToString:@"wss"] || [scheme isEqualToString:@"https"]) {
+        _secure = YES;
+    }
+    
+    _readyState = SR_CONNECTING;
+    _consumerStopped = YES;
+    _webSocketVersion = 13;
+    
+    _workQueue = dispatch_queue_create(NULL, DISPATCH_QUEUE_SERIAL);
+    
+    // Set a queue-specific value so we can later validate that we're running on the work queue.
+    dispatch_queue_set_specific(_workQueue, (__bridge void *)self, maybe_bridge(_workQueue), NULL);
+    
+    _delegateDispatchQueue = dispatch_get_main_queue();
+    sr_dispatch_retain(_delegateDispatchQueue);
+    
+    _readBuffer = [[NSMutableData alloc] init];
+    _outputBuffer = [[NSMutableData alloc] init];
+    
+    _currentFrameData = [[NSMutableData alloc] init];
+
+    _consumers = [[NSMutableArray alloc] init];
+    
+    _consumerPool = [[SRIOConsumerPool alloc] init];
+    
+    _scheduledRunloops = [[NSMutableSet alloc] init];
+    
+    [self _initializeStreams];
+    
+    // default handlers
+}
+
+- (void)assertOnWorkQueue;
+{
+    assert(dispatch_get_specific((__bridge void *)self) == maybe_bridge(_workQueue));
+}
+
+- (void)dealloc
+{
+    _inputStream.delegate = nil;
+    _outputStream.delegate = nil;
+
+    [_inputStream close];
+    [_outputStream close];
+    
+    sr_dispatch_release(_workQueue);
+    _workQueue = NULL;
+    
+    if (_receivedHTTPHeaders) {
+        CFRelease(_receivedHTTPHeaders);
+        _receivedHTTPHeaders = NULL;
+    }
+    
+    if (_delegateDispatchQueue) {
+        sr_dispatch_release(_delegateDispatchQueue);
+        _delegateDispatchQueue = NULL;
+    }
+}
+
+#ifndef NDEBUG
+
+- (void)setReadyState:(SRReadyState)aReadyState;
+{
+    [self willChangeValueForKey:@"readyState"];
+    assert(aReadyState > _readyState);
+    _readyState = aReadyState;
+    [self didChangeValueForKey:@"readyState"];
+}
+
+#endif
+
+- (void)open;
+{
+    assert(_url);
+    NSAssert(_readyState == SR_CONNECTING, @"Cannot call -(void)open on SRWebSocket more than once");
+
+    _selfRetain = self;
+    
+    [self _connect];
+}
+
+// Calls block on delegate queue
+- (void)_performDelegateBlock:(dispatch_block_t)block;
+{
+    if (_delegateOperationQueue) {
+        [_delegateOperationQueue addOperationWithBlock:block];
+    } else {
+        assert(_delegateDispatchQueue);
+        dispatch_async(_delegateDispatchQueue, block);
+    }
+}
+
+- (void)setDelegateDispatchQueue:(dispatch_queue_t)queue;
+{
+    if (queue) {
+        sr_dispatch_retain(queue);
+    }
+    
+    if (_delegateDispatchQueue) {
+        sr_dispatch_release(_delegateDispatchQueue);
+    }
+    
+    _delegateDispatchQueue = queue;
+}
+
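+// Per the WebSocket handshake, the server proves it saw our Sec-WebSocket-Key
+// by echoing base64(SHA1(key + fixed GUID)) in Sec-WebSocket-Accept; compare
+// it against the locally computed value.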
+- (BOOL)_checkHandshake:(CFHTTPMessageRef)httpMessage;
+{
+    NSString *acceptHeader = CFBridgingRelease(CFHTTPMessageCopyHeaderFieldValue(httpMessage, CFSTR("Sec-WebSocket-Accept")));
+
+    if (acceptHeader == nil) {
+        return NO;
+    }
+    
+    NSString *concattedString = [_secKey stringByAppendingString:SRWebSocketAppendToSecKeyString];
+    NSString *expectedAccept = [concattedString stringBySHA1ThenBase64Encoding];
+    
+    return [acceptHeader isEqualToString:expectedAccept];
+}
+
+- (void)_HTTPHeadersDidFinish;
+{
+    NSInteger responseCode = CFHTTPMessageGetResponseStatusCode(_receivedHTTPHeaders);
+    
+    if (responseCode >= 400) {
+        SRFastLog(@"Request failed with response code %d", responseCode);
+        [self _failWithError:[NSError errorWithDomain:SRWebSocketErrorDomain code:2132 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"received bad response code from server %ld", (long)responseCode], SRHTTPResponseErrorKey:@(responseCode)}]];
+        return;
+    }
+    
+    if(![self _checkHandshake:_receivedHTTPHeaders]) {
+        [self _failWithError:[NSError errorWithDomain:SRWebSocketErrorDomain code:2133 userInfo:[NSDictionary dictionaryWithObject:[NSString stringWithFormat:@"Invalid Sec-WebSocket-Accept response"] forKey:NSLocalizedDescriptionKey]]];
+        return;
+    }
+    
+    NSString *negotiatedProtocol = CFBridgingRelease(CFHTTPMessageCopyHeaderFieldValue(_receivedHTTPHeaders, CFSTR("Sec-WebSocket-Protocol")));
+    if (negotiatedProtocol) {
+        // Make sure we requested the protocol
+        if ([_requestedProtocols indexOfObject:negotiatedProtocol] == NSNotFound) {
+            [self _failWithError:[NSError errorWithDomain:SRWebSocketErrorDomain code:2133 userInfo:[NSDictionary dictionaryWithObject:[NSString stringWithFormat:@"Server specified Sec-WebSocket-Protocol that wasn't requested"] forKey:NSLocalizedDescriptionKey]]];
+            return;
+        }
+        
+        _protocol = negotiatedProtocol;
+    }
+    
+    self.readyState = SR_OPEN;
+    
+    if (!_didFail) {
+        [self _readFrameNew];
+    }
+
+    [self _performDelegateBlock:^{
+        if ([self.delegate respondsToSelector:@selector(webSocketDidOpen:)]) {
+            [self.delegate webSocketDidOpen:self];
+        };
+    }];
+}
+
+
+- (void)_readHTTPHeader;
+{
+    if (_receivedHTTPHeaders == NULL) {
+        _receivedHTTPHeaders = CFHTTPMessageCreateEmpty(NULL, NO);
+    }
+                        
+    [self _readUntilHeaderCompleteWithCallback:^(SRWebSocket *self,  NSData *data) {
+        CFHTTPMessageAppendBytes(_receivedHTTPHeaders, (const UInt8 *)data.bytes, data.length);
+        
+        if (CFHTTPMessageIsHeaderComplete(_receivedHTTPHeaders)) {
+            SRFastLog(@"Finished reading headers %@", CFBridgingRelease(CFHTTPMessageCopyAllHeaderFields(_receivedHTTPHeaders)));
+            [self _HTTPHeadersDidFinish];
+        } else {
+            [self _readHTTPHeader];
+        }
+    }];
+}
+
+- (void)didConnect
+{
+    SRFastLog(@"Connected");
+    CFHTTPMessageRef request = CFHTTPMessageCreateRequest(NULL, CFSTR("GET"), (__bridge CFURLRef)_url, kCFHTTPVersion1_1);
+    
+    // Set the Host header first so it gets a sensible default; user-supplied headers below can still override it.
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Host"), (__bridge CFStringRef)(_url.port ? [NSString stringWithFormat:@"%@:%@", _url.host, _url.port] : _url.host));
+        
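+    // Sec-WebSocket-Key is 16 random bytes, base64-encoded; 16 bytes always
+    // encode to 24 characters, which the assert below relies on.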
+    NSMutableData *keyBytes = [[NSMutableData alloc] initWithLength:16];
+    SecRandomCopyBytes(kSecRandomDefault, keyBytes.length, keyBytes.mutableBytes);
+    
+    if ([keyBytes respondsToSelector:@selector(base64EncodedStringWithOptions:)]) {
+        _secKey = [keyBytes base64EncodedStringWithOptions:0];
+    } else {
+        _secKey = [keyBytes base64Encoding];
+    }
+    
+    assert([_secKey length] == 24);
+    
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Upgrade"), CFSTR("websocket"));
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Connection"), CFSTR("Upgrade"));
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Sec-WebSocket-Key"), (__bridge CFStringRef)_secKey);
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Sec-WebSocket-Version"), (__bridge CFStringRef)[NSString stringWithFormat:@"%ld", (long)_webSocketVersion]);
+    
+    CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Origin"), (__bridge CFStringRef)_url.SR_origin);
+    
+    if (_requestedProtocols) {
+        CFHTTPMessageSetHeaderFieldValue(request, CFSTR("Sec-WebSocket-Protocol"), (__bridge CFStringRef)[_requestedProtocols componentsJoinedByString:@", "]);
+    }
+
+    [_urlRequest.allHTTPHeaderFields enumerateKeysAndObjectsUsingBlock:^(id key, id obj, BOOL *stop) {
+        CFHTTPMessageSetHeaderFieldValue(request, (__bridge CFStringRef)key, (__bridge CFStringRef)obj);
+    }];
+    
+    NSData *message = CFBridgingRelease(CFHTTPMessageCopySerializedMessage(request));
+    
+    CFRelease(request);
+
+    [self _writeData:message];
+    [self _readHTTPHeader];
+}
+
+- (void)_initializeStreams;
+{
+    assert(_url.port.unsignedIntValue <= UINT32_MAX);
+    uint32_t port = _url.port.unsignedIntValue;
+    if (port == 0) {
+        if (!_secure) {
+            port = 80;
+        } else {
+            port = 443;
+        }
+    }
+    NSString *host = _url.host;
+    
+    CFReadStreamRef readStream = NULL;
+    CFWriteStreamRef writeStream = NULL;
+    
+    CFStreamCreatePairWithSocketToHost(NULL, (__bridge CFStringRef)host, port, &readStream, &writeStream);
+    
+    _outputStream = CFBridgingRelease(writeStream);
+    _inputStream = CFBridgingRelease(readStream);
+    
+    
+    if (_secure) {
+        NSMutableDictionary *SSLOptions = [[NSMutableDictionary alloc] init];
+        
+        [_outputStream setProperty:(__bridge id)kCFStreamSocketSecurityLevelNegotiatedSSL forKey:(__bridge id)kCFStreamPropertySocketSecurityLevel];
+        
+        // If we're using pinned certs, don't validate the certificate chain
+        if ([_urlRequest SR_SSLPinnedCertificates].count) {
+            [SSLOptions setValue:[NSNumber numberWithBool:NO] forKey:(__bridge id)kCFStreamSSLValidatesCertificateChain];
+        }
+        
+#if DEBUG
+        [SSLOptions setValue:[NSNumber numberWithBool:NO] forKey:(__bridge id)kCFStreamSSLValidatesCertificateChain];
+        NSLog(@"SocketRocket: In debug mode.  Allowing connection to any root cert");
+#endif
+        
+        [_outputStream setProperty:SSLOptions
+                            forKey:(__bridge id)kCFStreamPropertySSLSettings];
+    }
+    
+    _inputStream.delegate = self;
+    _outputStream.delegate = self;
+}
+
+- (void)_connect;
+{
+    if (!_scheduledRunloops.count) {
+        [self scheduleInRunLoop:[NSRunLoop SR_networkRunLoop] forMode:NSDefaultRunLoopMode];
+    }
+    
+    
+    [_outputStream open];
+    [_inputStream open];
+}
+
+- (void)scheduleInRunLoop:(NSRunLoop *)aRunLoop forMode:(NSString *)mode;
+{
+    [_outputStream scheduleInRunLoop:aRunLoop forMode:mode];
+    [_inputStream scheduleInRunLoop:aRunLoop forMode:mode];
+    
+    [_scheduledRunloops addObject:@[aRunLoop, mode]];
+}
+
+- (void)unscheduleFromRunLoop:(NSRunLoop *)aRunLoop forMode:(NSString *)mode;
+{
+    [_outputStream removeFromRunLoop:aRunLoop forMode:mode];
+    [_inputStream removeFromRunLoop:aRunLoop forMode:mode];
+    
+    [_scheduledRunloops removeObject:@[aRunLoop, mode]];
+}
+
+- (void)close;
+{
+    [self closeWithCode:SRStatusCodeNormal reason:nil];
+}
+
+- (void)closeWithCode:(NSInteger)code reason:(NSString *)reason;
+{
+    assert(code);
+    dispatch_async(_workQueue, ^{
+        if (self.readyState == SR_CLOSING || self.readyState == SR_CLOSED) {
+            return;
+        }
+        
+        BOOL wasConnecting = self.readyState == SR_CONNECTING;
+        
+        self.readyState = SR_CLOSING;
+        
+        SRFastLog(@"Closing with code %d reason %@", code, reason);
+        
+        if (wasConnecting) {
+            [self _disconnect];
+            return;
+        }
+
+        size_t maxMsgSize = [reason maximumLengthOfBytesUsingEncoding:NSUTF8StringEncoding];
+        NSMutableData *mutablePayload = [[NSMutableData alloc] initWithLength:sizeof(uint16_t) + maxMsgSize];
+        NSData *payload = mutablePayload;
+        
+        ((uint16_t *)mutablePayload.mutableBytes)[0] = EndianU16_BtoN(code);
+        
+        if (reason) {
+            NSRange remainingRange = {0};
+            
+            NSUInteger usedLength = 0;
+            
+            BOOL success = [reason getBytes:(char *)mutablePayload.mutableBytes + sizeof(uint16_t) maxLength:payload.length - sizeof(uint16_t) usedLength:&usedLength encoding:NSUTF8StringEncoding options:NSStringEncodingConversionExternalRepresentation range:NSMakeRange(0, reason.length) remainingRange:&remainingRange];
+            
+            assert(success);
+            assert(remainingRange.length == 0);
+
+            if (usedLength != maxMsgSize) {
+                payload = [payload subdataWithRange:NSMakeRange(0, usedLength + sizeof(uint16_t))];
+            }
+        }
+        
+        
+        [self _sendFrameWithOpcode:SROpCodeConnectionClose data:payload];
+    });
+}
+
+- (void)_closeWithProtocolError:(NSString *)message;
+{
+    // Shunt this through the delegate callback queue first so any messages already queued for delivery are seen before the close.
+    [self _performDelegateBlock:^{
+        [self closeWithCode:SRStatusCodeProtocolError reason:message];
+        dispatch_async(_workQueue, ^{
+            [self _disconnect];
+        });
+    }];
+}
+
+- (void)_failWithError:(NSError *)error;
+{
+    dispatch_async(_workQueue, ^{
+        if (self.readyState != SR_CLOSED) {
+            _failed = YES;
+            [self _performDelegateBlock:^{
+                if ([self.delegate respondsToSelector:@selector(webSocket:didFailWithError:)]) {
+                    [self.delegate webSocket:self didFailWithError:error];
+                }
+            }];
+
+            self.readyState = SR_CLOSED;
+            _selfRetain = nil;
+
+            SRFastLog(@"Failing with error %@", error.localizedDescription);
+            
+            [self _disconnect];
+        }
+    });
+}
+
+- (void)_writeData:(NSData *)data;
+{    
+    [self assertOnWorkQueue];
+
+    if (_closeWhenFinishedWriting) {
+            return;
+    }
+    [_outputBuffer appendData:data];
+    [self _pumpWriting];
+}
+
+- (void)send:(id)data;
+{
+    NSAssert(self.readyState != SR_CONNECTING, @"Invalid State: Cannot call send: until connection is open");
+    // TODO: maybe not copy this for performance
+    data = [data copy];
+    dispatch_async(_workQueue, ^{
+        if ([data isKindOfClass:[NSString class]]) {
+            [self _sendFrameWithOpcode:SROpCodeTextFrame data:[(NSString *)data dataUsingEncoding:NSUTF8StringEncoding]];
+        } else if ([data isKindOfClass:[NSData class]]) {
+            [self _sendFrameWithOpcode:SROpCodeBinaryFrame data:data];
+        } else if (data == nil) {
+            [self _sendFrameWithOpcode:SROpCodeTextFrame data:data];
+        } else {
+            assert(NO);
+        }
+    });
+}
+
+- (void)sendPing:(NSData *)data;
+{
+    NSAssert(self.readyState == SR_OPEN, @"Invalid State: Cannot call sendPing: until connection is open");
+    // TODO: maybe not copy this for performance
+    data = [data copy] ?: [NSData data]; // It's okay for a ping to be empty
+    dispatch_async(_workQueue, ^{
+        [self _sendFrameWithOpcode:SROpCodePing data:data];
+    });
+}
+
+- (void)handlePing:(NSData *)pingData;
+{
+    // Bounce this off the delegate callback queue first so the pong is sent in order relative to delivered messages.
+    [self _performDelegateBlock:^{
+        dispatch_async(_workQueue, ^{
+            [self _sendFrameWithOpcode:SROpCodePong data:pingData];
+        });
+    }];
+}
+
+- (void)handlePong:(NSData *)pongData;
+{
+    SRFastLog(@"Received pong");
+    [self _performDelegateBlock:^{
+        if ([self.delegate respondsToSelector:@selector(webSocket:didReceivePong:)]) {
+            [self.delegate webSocket:self didReceivePong:pongData];
+        }
+    }];
+}
+
+- (void)_handleMessage:(id)message
+{
+    SRFastLog(@"Received message");
+    [self _performDelegateBlock:^{
+        [self.delegate webSocket:self didReceiveMessage:message];
+    }];
+}
+
+
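+// The ranges accepted here follow RFC 6455: 1000-1011 excluding the reserved
+// codes 1004-1006, plus the IANA-registered range 3000-3999 and the
+// private-use range 4000-4999.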
+static inline BOOL closeCodeIsValid(int closeCode) {
+    if (closeCode < 1000) {
+        return NO;
+    }
+    
+    if (closeCode >= 1000 && closeCode <= 1011) {
+        if (closeCode == 1004 ||
+            closeCode == 1005 ||
+            closeCode == 1006) {
+            return NO;
+        }
+        return YES;
+    }
+    
+    if (closeCode >= 3000 && closeCode <= 3999) {
+        return YES;
+    }
+    
+    if (closeCode >= 4000 && closeCode <= 4999) {
+        return YES;
+    }
+
+    return NO;
+}
+
+//  Note from RFC:
+//
+//  If there is a body, the first two
+//  bytes of the body MUST be a 2-byte unsigned integer (in network byte
+//  order) representing a status code with value /code/ defined in
+//  Section 7.4.  Following the 2-byte integer the body MAY contain UTF-8
+//  encoded data with value /reason/, the interpretation of which is not
+//  defined by this specification.
+
+- (void)handleCloseWithData:(NSData *)data;
+{
+    size_t dataSize = data.length;
+    __block uint16_t closeCode = 0;
+    
+    SRFastLog(@"Received close frame");
+    
+    if (dataSize == 1) {
+        // TODO handle error
+        [self _closeWithProtocolError:@"Payload for close must be larger than 2 bytes"];
+        return;
+    } else if (dataSize >= 2) {
+        [data getBytes:&closeCode length:sizeof(closeCode)];
+        _closeCode = EndianU16_BtoN(closeCode);
+        if (!closeCodeIsValid(_closeCode)) {
+            [self _closeWithProtocolError:[NSString stringWithFormat:@"Cannot have close code of %d", _closeCode]];
+            return;
+        }
+        if (dataSize > 2) {
+            _closeReason = [[NSString alloc] initWithData:[data subdataWithRange:NSMakeRange(2, dataSize - 2)] encoding:NSUTF8StringEncoding];
+            if (!_closeReason) {
+                [self _closeWithProtocolError:@"Close reason MUST be valid UTF-8"];
+                return;
+            }
+        }
+    } else {
+        _closeCode = SRStatusNoStatusReceived;
+    }
+    
+    [self assertOnWorkQueue];
+    
+    if (self.readyState == SR_OPEN) {
+        [self closeWithCode:1000 reason:nil];
+    }
+    dispatch_async(_workQueue, ^{
+        [self _disconnect];
+    });
+}
+
+- (void)_disconnect;
+{
+    [self assertOnWorkQueue];
+    SRFastLog(@"Trying to disconnect");
+    _closeWhenFinishedWriting = YES;
+    [self _pumpWriting];
+}
+
+- (void)_handleFrameWithData:(NSData *)frameData opCode:(NSInteger)opcode;
+{                
+    // Check that the current data is valid UTF8
+    
+    BOOL isControlFrame = (opcode == SROpCodePing || opcode == SROpCodePong || opcode == SROpCodeConnectionClose);
+    if (!isControlFrame) {
+        [self _readFrameNew];
+    } else {
+        dispatch_async(_workQueue, ^{
+            [self _readFrameContinue];
+        });
+    }
+    
+    switch (opcode) {
+        case SROpCodeTextFrame: {
+            NSString *str = [[NSString alloc] initWithData:frameData encoding:NSUTF8StringEncoding];
+            if (str == nil && frameData) {
+                [self closeWithCode:SRStatusCodeInvalidUTF8 reason:@"Text frames must be valid UTF-8"];
+                dispatch_async(_workQueue, ^{
+                    [self _disconnect];
+                });
+
+                return;
+            }
+            [self _handleMessage:str];
+            break;
+        }
+        case SROpCodeBinaryFrame:
+            [self _handleMessage:[frameData copy]];
+            break;
+        case SROpCodeConnectionClose:
+            [self handleCloseWithData:frameData];
+            break;
+        case SROpCodePing:
+            [self handlePing:frameData];
+            break;
+        case SROpCodePong:
+            [self handlePong:frameData];
+            break;
+        default:
+            [self _closeWithProtocolError:[NSString stringWithFormat:@"Unknown opcode %ld", (long)opcode]];
+            // TODO: Handle invalid opcode
+            break;
+    }
+}
+
+- (void)_handleFrameHeader:(frame_header)frame_header curData:(NSData *)curData;
+{
+    assert(frame_header.opcode != 0);
+    
+    if (self.readyState != SR_OPEN) {
+        return;
+    }
+    
+    
+    BOOL isControlFrame = (frame_header.opcode == SROpCodePing || frame_header.opcode == SROpCodePong || frame_header.opcode == SROpCodeConnectionClose);
+    
+    if (isControlFrame && !frame_header.fin) {
+        [self _closeWithProtocolError:@"Fragmented control frames not allowed"];
+        return;
+    }
+    
+    if (isControlFrame && frame_header.payload_length >= 126) {
+        [self _closeWithProtocolError:@"Control frames cannot have payloads larger than 126 bytes"];
+        return;
+    }
+    
+    if (!isControlFrame) {
+        _currentFrameOpcode = frame_header.opcode;
+        _currentFrameCount += 1;
+    }
+    
+    if (frame_header.payload_length == 0) {
+        if (isControlFrame) {
+            [self _handleFrameWithData:curData opCode:frame_header.opcode];
+        } else {
+            if (frame_header.fin) {
+                [self _handleFrameWithData:_currentFrameData opCode:frame_header.opcode];
+            } else {
+                // TODO: add assert that the opcode is not a control opcode.
+                [self _readFrameContinue];
+            }
+        }
+    } else {
+        assert(frame_header.payload_length <= SIZE_T_MAX);
+        [self _addConsumerWithDataLength:(size_t)frame_header.payload_length callback:^(SRWebSocket *self, NSData *newData) {
+            if (isControlFrame) {
+                [self _handleFrameWithData:newData opCode:frame_header.opcode];
+            } else {
+                if (frame_header.fin) {
+                    [self _handleFrameWithData:self->_currentFrameData opCode:frame_header.opcode];
+                } else {
+                    // TODO: add assert that the opcode is not a control opcode.
+                    [self _readFrameContinue];
+                }
+                
+            }
+        } readToCurrentFrame:!isControlFrame unmaskBytes:frame_header.masked];
+    }
+}
+
+/* From RFC:
+
+ 0                   1                   2                   3
+ 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
+ +-+-+-+-+-------+-+-------------+-------------------------------+
+ |F|R|R|R| opcode|M| Payload len |    Extended payload length    |
+ |I|S|S|S|  (4)  |A|     (7)     |             (16/64)           |
+ |N|V|V|V|       |S|             |   (if payload len==126/127)   |
+ | |1|2|3|       |K|             |                               |
+ +-+-+-+-+-------+-+-------------+ - - - - - - - - - - - - - - - +
+ |     Extended payload length continued, if payload len == 127  |
+ + - - - - - - - - - - - - - - - +-------------------------------+
+ |                               |Masking-key, if MASK set to 1  |
+ +-------------------------------+-------------------------------+
+ | Masking-key (continued)       |          Payload Data         |
+ +-------------------------------- - - - - - - - - - - - - - - - +
+ :                     Payload Data continued ...                :
+ + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +
+ |                     Payload Data continued ...                |
+ +---------------------------------------------------------------+
+ */
+
+static const uint8_t SRFinMask          = 0x80;
+static const uint8_t SROpCodeMask       = 0x0F;
+static const uint8_t SRRsvMask          = 0x70;
+static const uint8_t SRMaskMask         = 0x80;
+static const uint8_t SRPayloadLenMask   = 0x7F;
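+
+// The first header byte carries FIN, the RSV bits and the opcode; the second
+// carries the MASK bit and the 7-bit payload length, where 126/127 signal an
+// extended 16-/64-bit length that follows the two-byte header.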
+
+
+- (void)_readFrameContinue;
+{
+    assert((_currentFrameCount == 0 && _currentFrameOpcode == 0) || (_currentFrameCount > 0 && _currentFrameOpcode > 0));
+
+    [self _addConsumerWithDataLength:2 callback:^(SRWebSocket *self, NSData *data) {
+        __block frame_header header = {0};
+        
+        const uint8_t *headerBuffer = data.bytes;
+        assert(data.length >= 2);
+        
+        if (headerBuffer[0] & SRRsvMask) {
+            [self _closeWithProtocolError:@"Server used RSV bits"];
+            return;
+        }
+        
+        uint8_t receivedOpcode = (SROpCodeMask & headerBuffer[0]);
+        
+        BOOL isControlFrame = (receivedOpcode == SROpCodePing || receivedOpcode == SROpCodePong || receivedOpcode == SROpCodeConnectionClose);
+        
+        if (!isControlFrame && receivedOpcode != 0 && self->_currentFrameCount > 0) {
+            [self _closeWithProtocolError:@"all data frames after the initial data frame must have opcode 0"];
+            return;
+        }
+        
+        if (receivedOpcode == 0 && self->_currentFrameCount == 0) {
+            [self _closeWithProtocolError:@"cannot continue a message"];
+            return;
+        }
+        
+        header.opcode = receivedOpcode == 0 ? self->_currentFrameOpcode : receivedOpcode;
+        
+        header.fin = !!(SRFinMask & headerBuffer[0]);
+        
+        
+        header.masked = !!(SRMaskMask & headerBuffer[1]);
+        header.payload_length = SRPayloadLenMask & headerBuffer[1];
+        
+        headerBuffer = NULL;
+        
+        if (header.masked) {
+            [self _closeWithProtocolError:@"Client must receive unmasked data"];
+        }
+        
+        size_t extra_bytes_needed = header.masked ? sizeof(_currentReadMaskKey) : 0;
+        
+        if (header.payload_length == 126) {
+            extra_bytes_needed += sizeof(uint16_t);
+        } else if (header.payload_length == 127) {
+            extra_bytes_needed += sizeof(uint64_t);
+        }
+        
+        if (extra_bytes_needed == 0) {
+            [self _handleFrameHeader:header curData:self->_currentFrameData];
+        } else {
+            [self _addConsumerWithDataLength:extra_bytes_needed callback:^(SRWebSocket *self, NSData *data) {
+                size_t mapped_size = data.length;
+                const void *mapped_buffer = data.bytes;
+                size_t offset = 0;
+                
+                if (header.payload_length == 126) {
+                    assert(mapped_size >= sizeof(uint16_t));
+                    uint16_t newLen = EndianU16_BtoN(*(uint16_t *)(mapped_buffer));
+                    header.payload_length = newLen;
+                    offset += sizeof(uint16_t);
+                } else if (header.payload_length == 127) {
+                    assert(mapped_size >= sizeof(uint64_t));
+                    header.payload_length = EndianU64_BtoN(*(uint64_t *)(mapped_buffer));
+                    offset += sizeof(uint64_t);
+                } else {
+                    assert(header.payload_length < 126 && header.payload_length >= 0);
+                }
+                
+                
+                if (header.masked) {
+                    assert(mapped_size >= sizeof(_currentReadMaskOffset) + offset);
+                    memcpy(self->_currentReadMaskKey, ((uint8_t *)mapped_buffer) + offset, sizeof(self->_currentReadMaskKey));
+                }
+                
+                [self _handleFrameHeader:header curData:self->_currentFrameData];
+            } readToCurrentFrame:NO unmaskBytes:NO];
+        }
+    } readToCurrentFrame:NO unmaskBytes:NO];
+}
+
+- (void)_readFrameNew;
+{
+    dispatch_async(_workQueue, ^{
+        [_currentFrameData setLength:0];
+        
+        _currentFrameOpcode = 0;
+        _currentFrameCount = 0;
+        _readOpCount = 0;
+        _currentStringScanPosition = 0;
+        
+        [self _readFrameContinue];
+    });
+}
+
+- (void)_pumpWriting;
+{
+    [self assertOnWorkQueue];
+    
+    NSUInteger dataLength = _outputBuffer.length;
+    if (dataLength - _outputBufferOffset > 0 && _outputStream.hasSpaceAvailable) {
+        NSInteger bytesWritten = [_outputStream write:_outputBuffer.bytes + _outputBufferOffset maxLength:dataLength - _outputBufferOffset];
+        if (bytesWritten == -1) {
+            [self _failWithError:[NSError errorWithDomain:SRWebSocketErrorDomain code:2145 userInfo:[NSDictionary dictionaryWithObject:@"Error writing to stream" forKey:NSLocalizedDescriptionKey]]];
+             return;
+        }
+        
+        _outputBufferOffset += bytesWritten;
+        
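+        // Compact the buffer once more than 4 KB, and more than half of it,
+        // has already been written, so sent bytes don't accumulate forever.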
+        if (_outputBufferOffset > 4096 && _outputBufferOffset > (_outputBuffer.length >> 1)) {
+            _outputBuffer = [[NSMutableData alloc] initWithBytes:(char *)_outputBuffer.bytes + _outputBufferOffset length:_outputBuffer.length - _outputBufferOffset];
+            _outputBufferOffset = 0;
+        }
+    }
+    
+    if (_closeWhenFinishedWriting && 
+        _outputBuffer.length - _outputBufferOffset == 0 && 
+        (_inputStream.streamStatus != NSStreamStatusNotOpen &&
+         _inputStream.streamStatus != NSStreamStatusClosed) &&
+        !_sentClose) {
+        _sentClose = YES;
+            
+        [_outputStream close];
+        [_inputStream close];
+        
+        
+        for (NSArray *runLoop in [_scheduledRunloops copy]) {
+            [self unscheduleFromRunLoop:[runLoop objectAtIndex:0] forMode:[runLoop objectAtIndex:1]];
+        }
+        
+        if (!_failed) {
+            [self _performDelegateBlock:^{
+                if ([self.delegate respondsToSelector:@selector(webSocket:didCloseWithCode:reason:wasClean:)]) {
+                    [self.delegate webSocket:self didCloseWithCode:_closeCode reason:_closeReason wasClean:YES];
+                }
+            }];
+        }
+        
+        _selfRetain = nil;
+    }
+}
+
+- (void)_addConsumerWithScanner:(stream_scanner)consumer callback:(data_callback)callback;
+{
+    [self assertOnWorkQueue];
+    [self _addConsumerWithScanner:consumer callback:callback dataLength:0];
+}
+
+- (void)_addConsumerWithDataLength:(size_t)dataLength callback:(data_callback)callback readToCurrentFrame:(BOOL)readToCurrentFrame unmaskBytes:(BOOL)unmaskBytes;
+{   
+    [self assertOnWorkQueue];
+    assert(dataLength);
+    
+    [_consumers addObject:[_consumerPool consumerWithScanner:nil handler:callback bytesNeeded:dataLength readToCurrentFrame:readToCurrentFrame unmaskBytes:unmaskBytes]];
+    [self _pumpScanner];
+}
+
+- (void)_addConsumerWithScanner:(stream_scanner)consumer callback:(data_callback)callback dataLength:(size_t)dataLength;
+{    
+    [self assertOnWorkQueue];
+    [_consumers addObject:[_consumerPool consumerWithScanner:consumer handler:callback bytesNeeded:dataLength readToCurrentFrame:NO unmaskBytes:NO]];
+    [self _pumpScanner];
+}
+
+
+static const char CRLFCRLFBytes[] = {'\r', '\n', '\r', '\n'};
+
+- (void)_readUntilHeaderCompleteWithCallback:(data_callback)dataHandler;
+{
+    [self _readUntilBytes:CRLFCRLFBytes length:sizeof(CRLFCRLFBytes) callback:dataHandler];
+}
+
+- (void)_readUntilBytes:(const void *)bytes length:(size_t)length callback:(data_callback)dataHandler;
+{
+    // TODO optimize so this can continue from where we last searched
+    stream_scanner consumer = ^size_t(NSData *data) {
+        __block size_t found_size = 0;
+        __block size_t match_count = 0;
+        
+        size_t size = data.length;
+        const unsigned char *buffer = data.bytes;
+        for (size_t i = 0; i < size; i++ ) {
+            if (((const unsigned char *)buffer)[i] == ((const unsigned char *)bytes)[match_count]) {
+                match_count += 1;
+                if (match_count == length) {
+                    found_size = i + 1;
+                    break;
+                }
+            } else {
+                match_count = 0;
+            }
+        }
+        return found_size;
+    };
+    [self _addConsumerWithScanner:consumer callback:dataHandler];
+}
+
+
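+// The read path is driven by a queue of SRIOConsumer objects: each consumer
+// either waits for a fixed number of bytes (bytesNeeded) or runs a scanner
+// block over the buffered input to decide how much it wants.
+// _innerPumpScanner feeds the consumer at the head of the queue from
+// _readBuffer, optionally unmasking the bytes and appending them to the
+// current frame, and invokes the consumer's handler once it is satisfied.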
+// Returns YES if it did any work.
+- (BOOL)_innerPumpScanner {
+    
+    BOOL didWork = NO;
+    
+    if (self.readyState >= SR_CLOSING) {
+        return didWork;
+    }
+    
+    if (!_consumers.count) {
+        return didWork;
+    }
+    
+    size_t curSize = _readBuffer.length - _readBufferOffset;
+    if (!curSize) {
+        return didWork;
+    }
+    
+    SRIOConsumer *consumer = [_consumers objectAtIndex:0];
+    
+    size_t bytesNeeded = consumer.bytesNeeded;
+    
+    size_t foundSize = 0;
+    if (consumer.consumer) {
+        NSData *tempView = [NSData dataWithBytesNoCopy:(char *)_readBuffer.bytes + _readBufferOffset length:_readBuffer.length - _readBufferOffset freeWhenDone:NO];  
+        foundSize = consumer.consumer(tempView);
+    } else {
+        assert(consumer.bytesNeeded);
+        if (curSize >= bytesNeeded) {
+            foundSize = bytesNeeded;
+        } else if (consumer.readToCurrentFrame) {
+            foundSize = curSize;
+        }
+    }
+    
+    NSData *slice = nil;
+    if (consumer.readToCurrentFrame || foundSize) {
+        NSRange sliceRange = NSMakeRange(_readBufferOffset, foundSize);
+        slice = [_readBuffer subdataWithRange:sliceRange];
+        
+        _readBufferOffset += foundSize;
+        
+        if (_readBufferOffset > 4096 && _readBufferOffset > (_readBuffer.length >> 1)) {
+            _readBuffer = [[NSMutableData alloc] initWithBytes:(char *)_readBuffer.bytes + _readBufferOffset length:_readBuffer.length - _readBufferOffset];
+            _readBufferOffset = 0;
+        }
+        
+        if (consumer.unmaskBytes) {
+            NSMutableData *mutableSlice = [slice mutableCopy];
+            
+            NSUInteger len = mutableSlice.length;
+            uint8_t *bytes = mutableSlice.mutableBytes;
+            
+            for (NSUInteger i = 0; i < len; i++) {
+                bytes[i] = bytes[i] ^ _currentReadMaskKey[_currentReadMaskOffset % sizeof(_currentReadMaskKey)];
+                _currentReadMaskOffset += 1;
+            }
+            
+            slice = mutableSlice;
+        }
+        
+        if (consumer.readToCurrentFrame) {
+            [_currentFrameData appendData:slice];
+            
+            _readOpCount += 1;
+            
+            if (_currentFrameOpcode == SROpCodeTextFrame) {
+                // Validate that the accumulated text frame payload is valid UTF-8 so far.
+                size_t currentDataSize = _currentFrameData.length;
+                if (_currentFrameOpcode == SROpCodeTextFrame && currentDataSize > 0) {
+                    // TODO: Optimize this; we don't really have to copy all the data each time.
+                    
+                    size_t scanSize = currentDataSize - _currentStringScanPosition;
+                    
+                    NSData *scan_data = [_currentFrameData subdataWithRange:NSMakeRange(_currentStringScanPosition, scanSize)];
+                    int32_t valid_utf8_size = validate_dispatch_data_partial_string(scan_data);
+                    
+                    if (valid_utf8_size == -1) {
+                        [self closeWithCode:SRStatusCodeInvalidUTF8 reason:@"Text frames must be valid UTF-8"];
+                        dispatch_async(_workQueue, ^{
+                            [self _disconnect];
+                        });
+                        return didWork;
+                    } else {
+                        _currentStringScanPosition += valid_utf8_size;
+                    }
+                } 
+                
+            }
+            
+            consumer.bytesNeeded -= foundSize;
+            
+            if (consumer.bytesNeeded == 0) {
+                [_consumers removeObjectAtIndex:0];
+                consumer.handler(self, nil);
+                [_consumerPool returnConsumer:consumer];
+                didWork = YES;
+            }
+        } else if (foundSize) {
+            [_consumers removeObjectAtIndex:0];
+            consumer.handler(self, slice);
+            [_consumerPool returnConsumer:consumer];
+            didWork = YES;
+        }
+    }
+    return didWork;
+}
+
+-(void)_pumpScanner;
+{
+    [self assertOnWorkQueue];
+    
+    if (!_isPumping) {
+        _isPumping = YES;
+    } else {
+        return;
+    }
+    
+    while ([self _innerPumpScanner]) {
+        
+    }
+    
+    _isPumping = NO;
+}
+
+//#define NOMASK
+
+static const size_t SRFrameHeaderOverhead = 32;
+
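+// Builds a single WebSocket frame as described in RFC 6455 section 5.2:
+// byte 0 carries FIN plus the opcode; byte 1 carries the mask bit plus a
+// 7-bit payload length, where 126 or 127 select a 16-bit or 64-bit big-endian
+// extended length; this is followed by the 4-byte masking key and the payload
+// XOR-masked with that key.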
+- (void)_sendFrameWithOpcode:(SROpCode)opcode data:(id)data;
+{
+    [self assertOnWorkQueue];
+    
+    if (nil == data) {
+        return;
+    }
+    
+    NSAssert([data isKindOfClass:[NSData class]] || [data isKindOfClass:[NSString class]], @"NSString or NSData");
+    
+    size_t payloadLength = [data isKindOfClass:[NSString class]] ? [(NSString *)data lengthOfBytesUsingEncoding:NSUTF8StringEncoding] : [data length];
+        
+    NSMutableData *frame = [[NSMutableData alloc] initWithLength:payloadLength + SRFrameHeaderOverhead];
+    if (!frame) {
+        [self closeWithCode:SRStatusCodeMessageTooBig reason:@"Message too big"];
+        return;
+    }
+    uint8_t *frame_buffer = (uint8_t *)[frame mutableBytes];
+    
+    // set fin
+    frame_buffer[0] = SRFinMask | opcode;
+    
+    BOOL useMask = YES;
+#ifdef NOMASK
+    useMask = NO;
+#endif
+    
+    if (useMask) {
+        // Set the mask bit in the frame header.
+        frame_buffer[1] |= SRMaskMask;
+    }
+    
+    size_t frame_buffer_size = 2;
+    
+    const uint8_t *unmasked_payload = NULL;
+    if ([data isKindOfClass:[NSData class]]) {
+        unmasked_payload = (uint8_t *)[data bytes];
+    } else if ([data isKindOfClass:[NSString class]]) {
+        unmasked_payload =  (const uint8_t *)[data UTF8String];
+    } else {
+        return;
+    }
+    
+    if (payloadLength < 126) {
+        frame_buffer[1] |= payloadLength;
+    } else if (payloadLength <= UINT16_MAX) {
+        frame_buffer[1] |= 126;
+        *((uint16_t *)(frame_buffer + frame_buffer_size)) = EndianU16_BtoN((uint16_t)payloadLength);
+        frame_buffer_size += sizeof(uint16_t);
+    } else {
+        frame_buffer[1] |= 127;
+        *((uint64_t *)(frame_buffer + frame_buffer_size)) = EndianU64_BtoN((uint64_t)payloadLength);
+        frame_buffer_size += sizeof(uint64_t);
+    }
+        
+    if (!useMask) {
+        for (size_t i = 0; i < payloadLength; i++) {
+            frame_buffer[frame_buffer_size] = unmasked_payload[i];
+            frame_buffer_size += 1;
+        }
+    } else {
+        uint8_t *mask_key = frame_buffer + frame_buffer_size;
+        SecRandomCopyBytes(kSecRandomDefault, sizeof(uint32_t), (uint8_t *)mask_key);
+        frame_buffer_size += sizeof(uint32_t);
+        
+        // TODO: could probably optimize this with SIMD
+        for (size_t i = 0; i < payloadLength; i++) {
+            frame_buffer[frame_buffer_size] = unmasked_payload[i] ^ mask_key[i % sizeof(uint32_t)];
+            frame_buffer_size += 1;
+        }
+    }
+
+    assert(frame_buffer_size <= [frame length]);
+    frame.length = frame_buffer_size;
+    
+    [self _writeData:frame];
+}
+
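+// NSStreamDelegate entry point. For secure connections with pinned
+// certificates, when the stream first reports bytes or space available every
+// certificate in the server's trust chain is compared byte-for-byte against
+// the certificates supplied via SR_SSLPinnedCertificates on the request; if
+// none match, the connection is failed.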
+- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode;
+{
+    if (_secure && !_pinnedCertFound && (eventCode == NSStreamEventHasBytesAvailable || eventCode == NSStreamEventHasSpaceAvailable)) {
+        
+        NSArray *sslCerts = [_urlRequest SR_SSLPinnedCertificates];
+        if (sslCerts) {
+            SecTrustRef secTrust = (__bridge SecTrustRef)[aStream propertyForKey:(__bridge id)kCFStreamPropertySSLPeerTrust];
+            if (secTrust) {
+                NSInteger numCerts = SecTrustGetCertificateCount(secTrust);
+                for (NSInteger i = 0; i < numCerts && !_pinnedCertFound; i++) {
+                    SecCertificateRef cert = SecTrustGetCertificateAtIndex(secTrust, i);
+                    NSData *certData = CFBridgingRelease(SecCertificateCopyData(cert));
+                    
+                    for (id ref in sslCerts) {
+                        SecCertificateRef trustedCert = (__bridge SecCertificateRef)ref;
+                        NSData *trustedCertData = CFBridgingRelease(SecCertificateCopyData(trustedCert));
+                        
+                        if ([trustedCertData isEqualToData:certData]) {
+                            _pinnedCertFound = YES;
+                            break;
+                        }
+                    }
+                }
+            }
+            
+            if (!_pinnedCertFound) {
+                dispatch_async(_workQueue, ^{
+                    [self _failWithError:[NSError errorWithDomain:SRWebSocketErrorDomain code:23556 userInfo:[NSDictionary dictionaryWithObject:[NSString stringWithFormat:@"Invalid server cert"] forKey:NSLocalizedDescriptionKey]]];
+                });
+                return;
+            }
+        }
+    }
+
+    dispatch_async(_workQueue, ^{
+        switch (eventCode) {
+            case NSStreamEventOpenCompleted: {
+                SRFastLog(@"NSStreamEventOpenCompleted %@", aStream);
+                if (self.readyState >= SR_CLOSING) {
+                    return;
+                }
+                assert(_readBuffer);
+                
+                if (self.readyState == SR_CONNECTING && aStream == _inputStream) {
+                    [self didConnect];
+                }
+                [self _pumpWriting];
+                [self _pumpScanner];
+                break;
+            }
+                
+            case NSStreamEventErrorOccurred: {
+                SRFastLog(@"NSStreamEventErrorOccurred %@ %@", aStream, [[aStream streamError] copy]);
+                /// TODO specify error better!
+                [self _failWithError:aStream.streamError];
+                _readBufferOffset = 0;
+                [_readBuffer setLength:0];
+                break;
+                
+            }
+                
+            case NSStreamEventEndEncountered: {
+                [self _pumpScanner];
+                SRFastLog(@"NSStreamEventEndEncountered %@", aStream);
+                if (aStream.streamError) {
+                    [self _failWithError:aStream.streamError];
+                } else {
+                    if (self.readyState != SR_CLOSED) {
+                        self.readyState = SR_CLOSED;
+                        _selfRetain = nil;
+                    }
+
+                    if (!_sentClose && !_failed) {
+                        _sentClose = YES;
+                        // If the stream ends in this state the close was probably not clean,
+                        // because a close frame would normally have been sent first.
+                        [self _performDelegateBlock:^{
+                            if ([self.delegate respondsToSelector:@selector(webSocket:didCloseWithCode:reason:wasClean:)]) {
+                                [self.delegate webSocket:self didCloseWithCode:SRStatusCodeGoingAway reason:@"Stream end encountered" wasClean:NO];
+                            }
+                        }];
+                    }
+                }
+                
+                break;
+            }
+                
+            case NSStreamEventHasBytesAvailable: {
+                SRFastLog(@"NSStreamEventHasBytesAvailable %@", aStream);
+                const int bufferSize = 2048;
+                uint8_t buffer[bufferSize];
+                
+                while (_inputStream.hasBytesAvailable) {
+                    NSInteger bytes_read = [_inputStream read:buffer maxLength:bufferSize];
+                    
+                    if (bytes_read > 0) {
+                        [_readBuffer appendBytes:buffer length:bytes_read];
+                    } else if (bytes_read < 0) {
+                        [self _failWithError:_inputStream.streamError];
+                    }
+                    
+                    if (bytes_read != bufferSize) {
+                        break;
+                    }
+                }
+                [self _pumpScanner];
+                break;
+            }
+                
+            case NSStreamEventHasSpaceAvailable: {
+                SRFastLog(@"NSStreamEventHasSpaceAvailable %@", aStream);
+                [self _pumpWriting];
+                break;
+            }
+                
+            default:
+                SRFastLog(@"(default)  %@", aStream);
+                break;
+        }
+    });
+}
+
+@end
+
+
+@implementation SRIOConsumer
+
+@synthesize bytesNeeded = _bytesNeeded;
+@synthesize consumer = _scanner;
+@synthesize handler = _handler;
+@synthesize readToCurrentFrame = _readToCurrentFrame;
+@synthesize unmaskBytes = _unmaskBytes;
+
+- (void)setupWithScanner:(stream_scanner)scanner handler:(data_callback)handler bytesNeeded:(size_t)bytesNeeded readToCurrentFrame:(BOOL)readToCurrentFrame unmaskBytes:(BOOL)unmaskBytes;
+{
+    _scanner = [scanner copy];
+    _handler = [handler copy];
+    _bytesNeeded = bytesNeeded;
+    _readToCurrentFrame = readToCurrentFrame;
+    _unmaskBytes = unmaskBytes;
+    assert(_scanner || _bytesNeeded);
+}
+
+
+@end
+
+
+@implementation SRIOConsumerPool {
+    NSUInteger _poolSize;
+    NSMutableArray *_bufferedConsumers;
+}
+
+- (id)initWithBufferCapacity:(NSUInteger)poolSize;
+{
+    self = [super init];
+    if (self) {
+        _poolSize = poolSize;
+        _bufferedConsumers = [[NSMutableArray alloc] initWithCapacity:poolSize];
+    }
+    return self;
+}
+
+- (id)init
+{
+    return [self initWithBufferCapacity:8];
+}
+
+- (SRIOConsumer *)consumerWithScanner:(stream_scanner)scanner handler:(data_callback)handler bytesNeeded:(size_t)bytesNeeded readToCurrentFrame:(BOOL)readToCurrentFrame unmaskBytes:(BOOL)unmaskBytes;
+{
+    SRIOConsumer *consumer = nil;
+    if (_bufferedConsumers.count) {
+        consumer = [_bufferedConsumers lastObject];
+        [_bufferedConsumers removeLastObject];
+    } else {
+        consumer = [[SRIOConsumer alloc] init];
+    }
+    
+    [consumer setupWithScanner:scanner handler:handler bytesNeeded:bytesNeeded readToCurrentFrame:readToCurrentFrame unmaskBytes:unmaskBytes];
+    
+    return consumer;
+}
+
+- (void)returnConsumer:(SRIOConsumer *)consumer;
+{
+    if (_bufferedConsumers.count < _poolSize) {
+        [_bufferedConsumers addObject:consumer];
+    }
+}
+
+@end
+
+
+@implementation  NSURLRequest (CertificateAdditions)
+
+- (NSArray *)SR_SSLPinnedCertificates;
+{
+    return [NSURLProtocol propertyForKey:@"SR_SSLPinnedCertificates" inRequest:self];
+}
+
+@end
+
+@implementation  NSMutableURLRequest (CertificateAdditions)
+
+- (NSArray *)SR_SSLPinnedCertificates;
+{
+    return [NSURLProtocol propertyForKey:@"SR_SSLPinnedCertificates" inRequest:self];
+}
+
+- (void)setSR_SSLPinnedCertificates:(NSArray *)SR_SSLPinnedCertificates;
+{
+    [NSURLProtocol setProperty:SR_SSLPinnedCertificates forKey:@"SR_SSLPinnedCertificates" inRequest:self];
+}
+
+@end
+
+@implementation NSURL (SRWebSocket)
+
+- (NSString *)SR_origin;
+{
+    NSString *scheme = [self.scheme lowercaseString];
+        
+    if ([scheme isEqualToString:@"wss"]) {
+        scheme = @"https";
+    } else if ([scheme isEqualToString:@"ws"]) {
+        scheme = @"http";
+    }
+    
+    if (self.port) {
+        return [NSString stringWithFormat:@"%@://%@:%@/", scheme, self.host, self.port];
+    } else {
+        return [NSString stringWithFormat:@"%@://%@/", scheme, self.host];
+    }
+}
+
+@end
+
+//#define SR_ENABLE_LOG
+
+static inline void SRFastLog(NSString *format, ...)  {
+#ifdef SR_ENABLE_LOG
+    __block va_list arg_list;
+    va_start (arg_list, format);
+    
+    NSString *formattedString = [[NSString alloc] initWithFormat:format arguments:arg_list];
+    
+    va_end(arg_list);
+    
+    NSLog(@"[SR] %@", formattedString);
+#endif
+}
+
+
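+// Returns the number of leading bytes of data that form complete, valid UTF-8
+// sequences (so a multi-byte character split across frames is not treated as
+// an error), or -1 if the data cannot be valid UTF-8.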
+#ifdef HAS_ICU
+
+static inline int32_t validate_dispatch_data_partial_string(NSData *data) {
+    if ([data length] > INT32_MAX) {
+        // INT32_MAX is the limit so long as this Framework is using 32 bit ints everywhere.
+        return -1;
+    }
+
+    int32_t size = (int32_t)[data length];
+
+    const void * contents = [data bytes];
+    const uint8_t *str = (const uint8_t *)contents;
+    
+    UChar32 codepoint = 1;
+    int32_t offset = 0;
+    int32_t lastOffset = 0;
+    while(offset < size && codepoint > 0)  {
+        lastOffset = offset;
+        U8_NEXT(str, offset, size, codepoint);
+    }
+    
+    if (codepoint == -1) {
+        // Check to see if the last byte is valid or whether it was just continuing
+        if (!U8_IS_LEAD(str[lastOffset]) || U8_COUNT_TRAIL_BYTES(str[lastOffset]) + lastOffset < (int32_t)size) {
+            
+            size = -1;
+        } else {
+            uint8_t leadByte = str[lastOffset];
+            U8_MASK_LEAD_BYTE(leadByte, U8_COUNT_TRAIL_BYTES(leadByte));
+            
+            for (int i = lastOffset + 1; i < offset; i++) {
+                if (U8_IS_SINGLE(str[i]) || U8_IS_LEAD(str[i]) || !U8_IS_TRAIL(str[i])) {
+                    size = -1;
+                }
+            }
+            
+            if (size != -1) {
+                size = lastOffset;
+            }
+        }
+    }
+    
+    if (size != -1 && ![[NSString alloc] initWithBytesNoCopy:(char *)[data bytes] length:size encoding:NSUTF8StringEncoding freeWhenDone:NO]) {
+        size = -1;
+    }
+    
+    return size;
+}
+
+#else
+
+// This is a hack, and probably not optimal
+static inline int32_t validate_dispatch_data_partial_string(NSData *data) {
+    static const int maxCodepointSize = 3;
+    
+    for (int i = 0; i < maxCodepointSize; i++) {
+        NSString *str = [[NSString alloc] initWithBytesNoCopy:(char *)data.bytes length:data.length - i encoding:NSUTF8StringEncoding freeWhenDone:NO];
+        if (str) {
+            return data.length - i;
+        }
+    }
+    
+    return -1;
+}
+
+#endif
+
+static _SRRunLoopThread *networkThread = nil;
+static NSRunLoop *networkRunLoop = nil;
+
+@implementation NSRunLoop (SRWebSocket)
+
++ (NSRunLoop *)SR_networkRunLoop {
+    static dispatch_once_t onceToken;
+    dispatch_once(&onceToken, ^{
+        networkThread = [[_SRRunLoopThread alloc] init];
+        networkThread.name = @"com.squareup.SocketRocket.NetworkThread";
+        [networkThread start];
+        networkRunLoop = networkThread.runLoop;
+    });
+    
+    return networkRunLoop;
+}
+
+@end
+
+
+@implementation _SRRunLoopThread {
+    dispatch_group_t _waitGroup;
+}
+
+@synthesize runLoop = _runLoop;
+
+- (void)dealloc
+{
+    sr_dispatch_release(_waitGroup);
+}
+
+- (id)init
+{
+    self = [super init];
+    if (self) {
+        _waitGroup = dispatch_group_create();
+        dispatch_group_enter(_waitGroup);
+    }
+    return self;
+}
+
+- (void)main;
+{
+    @autoreleasepool {
+        _runLoop = [NSRunLoop currentRunLoop];
+        dispatch_group_leave(_waitGroup);
+        
+        NSTimer *timer = [[NSTimer alloc] initWithFireDate:[NSDate distantFuture] interval:0.0 target:nil selector:nil userInfo:nil repeats:NO];
+        [_runLoop addTimer:timer forMode:NSDefaultRunLoopMode];
+        
+        while ([_runLoop runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]]) {
+            
+        }
+        assert(NO);
+    }
+}
+
+- (NSRunLoop *)runLoop;
+{
+    dispatch_group_wait(_waitGroup, DISPATCH_TIME_FOREVER);
+    return _runLoop;
+}
+
+@end
diff --git a/examples/objc/Icon.png b/examples/objc/Icon.png
new file mode 100644
index 0000000..55773ca
--- /dev/null
+++ b/examples/objc/Icon.png
Binary files differ
diff --git a/examples/objc/README b/examples/objc/README
new file mode 100644
index 0000000..bfe18b3
--- /dev/null
+++ b/examples/objc/README
@@ -0,0 +1,3 @@
+This directory contains sample iOS and Mac clients for http://apprtc.appspot.com
+
+See ../../app/webrtc/objc/README for information on how to use it.
diff --git a/examples/peerconnection/client/conductor.cc b/examples/peerconnection/client/conductor.cc
new file mode 100644
index 0000000..e3def99
--- /dev/null
+++ b/examples/peerconnection/client/conductor.cc
@@ -0,0 +1,546 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/conductor.h"
+
+#include <utility>
+#include <vector>
+
+#include "talk/app/webrtc/videosourceinterface.h"
+#include "webrtc/examples/peerconnection/client/defaults.h"
+#include "talk/media/devices/devicemanager.h"
+#include "talk/app/webrtc/test/fakeconstraints.h"
+#include "webrtc/base/common.h"
+#include "webrtc/base/json.h"
+#include "webrtc/base/logging.h"
+
+// Names used for an IceCandidate JSON object.
+const char kCandidateSdpMidName[] = "sdpMid";
+const char kCandidateSdpMlineIndexName[] = "sdpMLineIndex";
+const char kCandidateSdpName[] = "candidate";
+
+// Names used for a SessionDescription JSON object.
+const char kSessionDescriptionTypeName[] = "type";
+const char kSessionDescriptionSdpName[] = "sdp";
+
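+// The signaling messages exchanged with the peer are small JSON objects built
+// from the keys above, for example (illustrative values):
+//   {"type": "offer", "sdp": "v=0..."}
+//   {"sdpMid": "video", "sdpMLineIndex": 1, "candidate": "candidate:..."}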
+#define DTLS_ON  true
+#define DTLS_OFF false
+
+class DummySetSessionDescriptionObserver
+    : public webrtc::SetSessionDescriptionObserver {
+ public:
+  static DummySetSessionDescriptionObserver* Create() {
+    return
+        new rtc::RefCountedObject<DummySetSessionDescriptionObserver>();
+  }
+  virtual void OnSuccess() {
+    LOG(INFO) << __FUNCTION__;
+  }
+  virtual void OnFailure(const std::string& error) {
+    LOG(INFO) << __FUNCTION__ << " " << error;
+  }
+
+ protected:
+  DummySetSessionDescriptionObserver() {}
+  ~DummySetSessionDescriptionObserver() {}
+};
+
+Conductor::Conductor(PeerConnectionClient* client, MainWindow* main_wnd)
+  : peer_id_(-1),
+    loopback_(false),
+    client_(client),
+    main_wnd_(main_wnd) {
+  client_->RegisterObserver(this);
+  main_wnd->RegisterObserver(this);
+}
+
+Conductor::~Conductor() {
+  ASSERT(peer_connection_.get() == NULL);
+}
+
+bool Conductor::connection_active() const {
+  return peer_connection_.get() != NULL;
+}
+
+void Conductor::Close() {
+  client_->SignOut();
+  DeletePeerConnection();
+}
+
+bool Conductor::InitializePeerConnection() {
+  ASSERT(peer_connection_factory_.get() == NULL);
+  ASSERT(peer_connection_.get() == NULL);
+
+  peer_connection_factory_  = webrtc::CreatePeerConnectionFactory();
+
+  if (!peer_connection_factory_.get()) {
+    main_wnd_->MessageBox("Error",
+        "Failed to initialize PeerConnectionFactory", true);
+    DeletePeerConnection();
+    return false;
+  }
+
+  if (!CreatePeerConnection(DTLS_ON)) {
+    main_wnd_->MessageBox("Error",
+        "CreatePeerConnection failed", true);
+    DeletePeerConnection();
+  }
+  AddStreams();
+  return peer_connection_.get() != NULL;
+}
+
+bool Conductor::ReinitializePeerConnectionForLoopback() {
+  loopback_ = true;
+  rtc::scoped_refptr<webrtc::StreamCollectionInterface> streams(
+      peer_connection_->local_streams());
+  peer_connection_ = NULL;
+  if (CreatePeerConnection(DTLS_OFF)) {
+    for (size_t i = 0; i < streams->count(); ++i)
+      peer_connection_->AddStream(streams->at(i));
+    peer_connection_->CreateOffer(this, NULL);
+  }
+  return peer_connection_.get() != NULL;
+}
+
+bool Conductor::CreatePeerConnection(bool dtls) {
+  ASSERT(peer_connection_factory_.get() != NULL);
+  ASSERT(peer_connection_.get() == NULL);
+
+  webrtc::PeerConnectionInterface::IceServers servers;
+  webrtc::PeerConnectionInterface::IceServer server;
+  server.uri = GetPeerConnectionString();
+  servers.push_back(server);
+
+  webrtc::FakeConstraints constraints;
+  if (dtls) {
+    constraints.AddOptional(webrtc::MediaConstraintsInterface::kEnableDtlsSrtp,
+                            "true");
+  } else {
+    constraints.AddOptional(webrtc::MediaConstraintsInterface::kEnableDtlsSrtp,
+                            "false");
+  }
+
+  peer_connection_ =
+      peer_connection_factory_->CreatePeerConnection(servers,
+                                                     &constraints,
+                                                     NULL,
+                                                     NULL,
+                                                     this);
+  return peer_connection_.get() != NULL;
+}
+
+void Conductor::DeletePeerConnection() {
+  peer_connection_ = NULL;
+  active_streams_.clear();
+  main_wnd_->StopLocalRenderer();
+  main_wnd_->StopRemoteRenderer();
+  peer_connection_factory_ = NULL;
+  peer_id_ = -1;
+  loopback_ = false;
+}
+
+void Conductor::EnsureStreamingUI() {
+  ASSERT(peer_connection_.get() != NULL);
+  if (main_wnd_->IsWindow()) {
+    if (main_wnd_->current_ui() != MainWindow::STREAMING)
+      main_wnd_->SwitchToStreamingUI();
+  }
+}
+
+//
+// PeerConnectionObserver implementation.
+//
+
+// Called when a remote stream is added
+void Conductor::OnAddStream(webrtc::MediaStreamInterface* stream) {
+  LOG(INFO) << __FUNCTION__ << " " << stream->label();
+
+  stream->AddRef();
+  main_wnd_->QueueUIThreadCallback(NEW_STREAM_ADDED,
+                                   stream);
+}
+
+void Conductor::OnRemoveStream(webrtc::MediaStreamInterface* stream) {
+  LOG(INFO) << __FUNCTION__ << " " << stream->label();
+  stream->AddRef();
+  main_wnd_->QueueUIThreadCallback(STREAM_REMOVED,
+                                   stream);
+}
+
+void Conductor::OnIceCandidate(const webrtc::IceCandidateInterface* candidate) {
+  LOG(INFO) << __FUNCTION__ << " " << candidate->sdp_mline_index();
+  // For loopback tests, apply the candidate directly to save some connection
+  // delay.
+  if (loopback_) {
+    if (!peer_connection_->AddIceCandidate(candidate)) {
+      LOG(WARNING) << "Failed to apply the received candidate";
+    }
+    return;
+  }
+
+  Json::StyledWriter writer;
+  Json::Value jmessage;
+
+  jmessage[kCandidateSdpMidName] = candidate->sdp_mid();
+  jmessage[kCandidateSdpMlineIndexName] = candidate->sdp_mline_index();
+  std::string sdp;
+  if (!candidate->ToString(&sdp)) {
+    LOG(LS_ERROR) << "Failed to serialize candidate";
+    return;
+  }
+  jmessage[kCandidateSdpName] = sdp;
+  SendMessage(writer.write(jmessage));
+}
+
+//
+// PeerConnectionClientObserver implementation.
+//
+
+void Conductor::OnSignedIn() {
+  LOG(INFO) << __FUNCTION__;
+  main_wnd_->SwitchToPeerList(client_->peers());
+}
+
+void Conductor::OnDisconnected() {
+  LOG(INFO) << __FUNCTION__;
+
+  DeletePeerConnection();
+
+  if (main_wnd_->IsWindow())
+    main_wnd_->SwitchToConnectUI();
+}
+
+void Conductor::OnPeerConnected(int id, const std::string& name) {
+  LOG(INFO) << __FUNCTION__;
+  // Refresh the list if we're showing it.
+  if (main_wnd_->current_ui() == MainWindow::LIST_PEERS)
+    main_wnd_->SwitchToPeerList(client_->peers());
+}
+
+void Conductor::OnPeerDisconnected(int id) {
+  LOG(INFO) << __FUNCTION__;
+  if (id == peer_id_) {
+    LOG(INFO) << "Our peer disconnected";
+    main_wnd_->QueueUIThreadCallback(PEER_CONNECTION_CLOSED, NULL);
+  } else {
+    // Refresh the list if we're showing it.
+    if (main_wnd_->current_ui() == MainWindow::LIST_PEERS)
+      main_wnd_->SwitchToPeerList(client_->peers());
+  }
+}
+
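+// Dispatches an incoming signaling message: "offer-loopback" restarts the
+// connection with DTLS disabled, any other message with a "type" field is
+// treated as a session description, and everything else is parsed as an ICE
+// candidate.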
+void Conductor::OnMessageFromPeer(int peer_id, const std::string& message) {
+  ASSERT(peer_id_ == peer_id || peer_id_ == -1);
+  ASSERT(!message.empty());
+
+  if (!peer_connection_.get()) {
+    ASSERT(peer_id_ == -1);
+    peer_id_ = peer_id;
+
+    if (!InitializePeerConnection()) {
+      LOG(LS_ERROR) << "Failed to initialize our PeerConnection instance";
+      client_->SignOut();
+      return;
+    }
+  } else if (peer_id != peer_id_) {
+    ASSERT(peer_id_ != -1);
+    LOG(WARNING) << "Received a message from unknown peer while already in a "
+                    "conversation with a different peer.";
+    return;
+  }
+
+  Json::Reader reader;
+  Json::Value jmessage;
+  if (!reader.parse(message, jmessage)) {
+    LOG(WARNING) << "Received unknown message. " << message;
+    return;
+  }
+  std::string type;
+  std::string json_object;
+
+  rtc::GetStringFromJsonObject(jmessage, kSessionDescriptionTypeName, &type);
+  if (!type.empty()) {
+    if (type == "offer-loopback") {
+      // This is a loopback call.
+      // Recreate the peerconnection with DTLS disabled.
+      if (!ReinitializePeerConnectionForLoopback()) {
+        LOG(LS_ERROR) << "Failed to initialize our PeerConnection instance";
+        DeletePeerConnection();
+        client_->SignOut();
+      }
+      return;
+    }
+
+    std::string sdp;
+    if (!rtc::GetStringFromJsonObject(jmessage, kSessionDescriptionSdpName,
+                                      &sdp)) {
+      LOG(WARNING) << "Can't parse received session description message.";
+      return;
+    }
+    webrtc::SdpParseError error;
+    webrtc::SessionDescriptionInterface* session_description(
+        webrtc::CreateSessionDescription(type, sdp, &error));
+    if (!session_description) {
+      LOG(WARNING) << "Can't parse received session description message. "
+          << "SdpParseError was: " << error.description;
+      return;
+    }
+    LOG(INFO) << "Received session description: " << message;
+    peer_connection_->SetRemoteDescription(
+        DummySetSessionDescriptionObserver::Create(), session_description);
+    if (session_description->type() ==
+        webrtc::SessionDescriptionInterface::kOffer) {
+      peer_connection_->CreateAnswer(this, NULL);
+    }
+    return;
+  } else {
+    std::string sdp_mid;
+    int sdp_mlineindex = 0;
+    std::string sdp;
+    if (!rtc::GetStringFromJsonObject(jmessage, kCandidateSdpMidName,
+                                      &sdp_mid) ||
+        !rtc::GetIntFromJsonObject(jmessage, kCandidateSdpMlineIndexName,
+                                   &sdp_mlineindex) ||
+        !rtc::GetStringFromJsonObject(jmessage, kCandidateSdpName, &sdp)) {
+      LOG(WARNING) << "Can't parse received message.";
+      return;
+    }
+    webrtc::SdpParseError error;
+    rtc::scoped_ptr<webrtc::IceCandidateInterface> candidate(
+        webrtc::CreateIceCandidate(sdp_mid, sdp_mlineindex, sdp, &error));
+    if (!candidate.get()) {
+      LOG(WARNING) << "Can't parse received candidate message. "
+          << "SdpParseError was: " << error.description;
+      return;
+    }
+    if (!peer_connection_->AddIceCandidate(candidate.get())) {
+      LOG(WARNING) << "Failed to apply the received candidate";
+      return;
+    }
+    LOG(INFO) << "Received candidate: " << message;
+    return;
+  }
+}
+
+void Conductor::OnMessageSent(int err) {
+  // Process the next pending message if any.
+  main_wnd_->QueueUIThreadCallback(SEND_MESSAGE_TO_PEER, NULL);
+}
+
+void Conductor::OnServerConnectionFailure() {
+  main_wnd_->MessageBox("Error", ("Failed to connect to " + server_).c_str(),
+                        true);
+}
+
+//
+// MainWndCallback implementation.
+//
+
+void Conductor::StartLogin(const std::string& server, int port) {
+  if (client_->is_connected())
+    return;
+  server_ = server;
+  client_->Connect(server, port, GetPeerName());
+}
+
+void Conductor::DisconnectFromServer() {
+  if (client_->is_connected())
+    client_->SignOut();
+}
+
+void Conductor::ConnectToPeer(int peer_id) {
+  ASSERT(peer_id_ == -1);
+  ASSERT(peer_id != -1);
+
+  if (peer_connection_.get()) {
+    main_wnd_->MessageBox("Error",
+        "We only support connecting to one peer at a time", true);
+    return;
+  }
+
+  if (InitializePeerConnection()) {
+    peer_id_ = peer_id;
+    peer_connection_->CreateOffer(this, NULL);
+  } else {
+    main_wnd_->MessageBox("Error", "Failed to initialize PeerConnection", true);
+  }
+}
+
+cricket::VideoCapturer* Conductor::OpenVideoCaptureDevice() {
+  rtc::scoped_ptr<cricket::DeviceManagerInterface> dev_manager(
+      cricket::DeviceManagerFactory::Create());
+  if (!dev_manager->Init()) {
+    LOG(LS_ERROR) << "Can't create device manager";
+    return NULL;
+  }
+  std::vector<cricket::Device> devs;
+  if (!dev_manager->GetVideoCaptureDevices(&devs)) {
+    LOG(LS_ERROR) << "Can't enumerate video devices";
+    return NULL;
+  }
+  std::vector<cricket::Device>::iterator dev_it = devs.begin();
+  cricket::VideoCapturer* capturer = NULL;
+  for (; dev_it != devs.end(); ++dev_it) {
+    capturer = dev_manager->CreateVideoCapturer(*dev_it);
+    if (capturer != NULL)
+      break;
+  }
+  return capturer;
+}
+
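+// Creates one audio track and one video track (fed by the first usable video
+// capture device), bundles them into a local MediaStream, hands the video
+// track to the local renderer and adds the stream to the PeerConnection.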
+void Conductor::AddStreams() {
+  if (active_streams_.find(kStreamLabel) != active_streams_.end())
+    return;  // Already added.
+
+  rtc::scoped_refptr<webrtc::AudioTrackInterface> audio_track(
+      peer_connection_factory_->CreateAudioTrack(
+          kAudioLabel, peer_connection_factory_->CreateAudioSource(NULL)));
+
+  rtc::scoped_refptr<webrtc::VideoTrackInterface> video_track(
+      peer_connection_factory_->CreateVideoTrack(
+          kVideoLabel,
+          peer_connection_factory_->CreateVideoSource(OpenVideoCaptureDevice(),
+                                                      NULL)));
+  main_wnd_->StartLocalRenderer(video_track);
+
+  rtc::scoped_refptr<webrtc::MediaStreamInterface> stream =
+      peer_connection_factory_->CreateLocalMediaStream(kStreamLabel);
+
+  stream->AddTrack(audio_track);
+  stream->AddTrack(video_track);
+  if (!peer_connection_->AddStream(stream)) {
+    LOG(LS_ERROR) << "Adding stream to PeerConnection failed";
+  }
+  typedef std::pair<std::string,
+                    rtc::scoped_refptr<webrtc::MediaStreamInterface> >
+      MediaStreamPair;
+  active_streams_.insert(MediaStreamPair(stream->label(), stream));
+  main_wnd_->SwitchToStreamingUI();
+}
+
+void Conductor::DisconnectFromCurrentPeer() {
+  LOG(INFO) << __FUNCTION__;
+  if (peer_connection_.get()) {
+    client_->SendHangUp(peer_id_);
+    DeletePeerConnection();
+  }
+
+  if (main_wnd_->IsWindow())
+    main_wnd_->SwitchToPeerList(client_->peers());
+}
+
+void Conductor::UIThreadCallback(int msg_id, void* data) {
+  switch (msg_id) {
+    case PEER_CONNECTION_CLOSED:
+      LOG(INFO) << "PEER_CONNECTION_CLOSED";
+      DeletePeerConnection();
+
+      ASSERT(active_streams_.empty());
+
+      if (main_wnd_->IsWindow()) {
+        if (client_->is_connected()) {
+          main_wnd_->SwitchToPeerList(client_->peers());
+        } else {
+          main_wnd_->SwitchToConnectUI();
+        }
+      } else {
+        DisconnectFromServer();
+      }
+      break;
+
+    case SEND_MESSAGE_TO_PEER: {
+      LOG(INFO) << "SEND_MESSAGE_TO_PEER";
+      std::string* msg = reinterpret_cast<std::string*>(data);
+      if (msg) {
+        // For convenience, we always run the message through the queue.
+        // This way we can be sure that messages are sent to the server
+        // in the same order they were signaled without much hassle.
+        pending_messages_.push_back(msg);
+      }
+
+      if (!pending_messages_.empty() && !client_->IsSendingMessage()) {
+        msg = pending_messages_.front();
+        pending_messages_.pop_front();
+
+        if (!client_->SendToPeer(peer_id_, *msg) && peer_id_ != -1) {
+          LOG(LS_ERROR) << "SendToPeer failed";
+          DisconnectFromServer();
+        }
+        delete msg;
+      }
+
+      if (!peer_connection_.get())
+        peer_id_ = -1;
+
+      break;
+    }
+
+    case NEW_STREAM_ADDED: {
+      webrtc::MediaStreamInterface* stream =
+          reinterpret_cast<webrtc::MediaStreamInterface*>(
+          data);
+      webrtc::VideoTrackVector tracks = stream->GetVideoTracks();
+      // Only render the first track.
+      if (!tracks.empty()) {
+        webrtc::VideoTrackInterface* track = tracks[0];
+        main_wnd_->StartRemoteRenderer(track);
+      }
+      stream->Release();
+      break;
+    }
+
+    case STREAM_REMOVED: {
+      // Remote peer stopped sending a stream.
+      webrtc::MediaStreamInterface* stream =
+          reinterpret_cast<webrtc::MediaStreamInterface*>(
+          data);
+      stream->Release();
+      break;
+    }
+
+    default:
+      ASSERT(false);
+      break;
+  }
+}
+
+void Conductor::OnSuccess(webrtc::SessionDescriptionInterface* desc) {
+  peer_connection_->SetLocalDescription(
+      DummySetSessionDescriptionObserver::Create(), desc);
+
+  std::string sdp;
+  desc->ToString(&sdp);
+
+  // For loopback tests, feed the description straight back as the remote
+  // answer to save some connection delay.
+  if (loopback_) {
+    // Replace the message type "offer" with "answer".
+    webrtc::SessionDescriptionInterface* session_description(
+        webrtc::CreateSessionDescription("answer", sdp, nullptr));
+    peer_connection_->SetRemoteDescription(
+        DummySetSessionDescriptionObserver::Create(), session_description);
+    return;
+  }
+
+  Json::StyledWriter writer;
+  Json::Value jmessage;
+  jmessage[kSessionDescriptionTypeName] = desc->type();
+  jmessage[kSessionDescriptionSdpName] = sdp;
+  SendMessage(writer.write(jmessage));
+}
+
+void Conductor::OnFailure(const std::string& error) {
+  LOG(LERROR) << error;
+}
+
+void Conductor::SendMessage(const std::string& json_object) {
+  std::string* msg = new std::string(json_object);
+  main_wnd_->QueueUIThreadCallback(SEND_MESSAGE_TO_PEER, msg);
+}
diff --git a/examples/peerconnection/client/conductor.h b/examples/peerconnection/client/conductor.h
new file mode 100644
index 0000000..f5f16a3
--- /dev/null
+++ b/examples/peerconnection/client/conductor.h
@@ -0,0 +1,129 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef TALK_EXAMPLES_PEERCONNECTION_CLIENT_CONDUCTOR_H_
+#define TALK_EXAMPLES_PEERCONNECTION_CLIENT_CONDUCTOR_H_
+#pragma once
+
+#include <deque>
+#include <map>
+#include <set>
+#include <string>
+
+#include "talk/app/webrtc/mediastreaminterface.h"
+#include "talk/app/webrtc/peerconnectioninterface.h"
+#include "webrtc/examples/peerconnection/client/main_wnd.h"
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+#include "webrtc/base/scoped_ptr.h"
+
+namespace webrtc {
+class VideoCaptureModule;
+}  // namespace webrtc
+
+namespace cricket {
+class VideoRenderer;
+}  // namespace cricket
+
+class Conductor
+  : public webrtc::PeerConnectionObserver,
+    public webrtc::CreateSessionDescriptionObserver,
+    public PeerConnectionClientObserver,
+    public MainWndCallback {
+ public:
+  enum CallbackID {
+    MEDIA_CHANNELS_INITIALIZED = 1,
+    PEER_CONNECTION_CLOSED,
+    SEND_MESSAGE_TO_PEER,
+    NEW_STREAM_ADDED,
+    STREAM_REMOVED,
+  };
+
+  Conductor(PeerConnectionClient* client, MainWindow* main_wnd);
+
+  bool connection_active() const;
+
+  virtual void Close();
+
+ protected:
+  ~Conductor();
+  bool InitializePeerConnection();
+  bool ReinitializePeerConnectionForLoopback();
+  bool CreatePeerConnection(bool dtls);
+  void DeletePeerConnection();
+  void EnsureStreamingUI();
+  void AddStreams();
+  cricket::VideoCapturer* OpenVideoCaptureDevice();
+
+  //
+  // PeerConnectionObserver implementation.
+  //
+  virtual void OnStateChange(
+      webrtc::PeerConnectionObserver::StateType state_changed) {}
+  virtual void OnAddStream(webrtc::MediaStreamInterface* stream);
+  virtual void OnRemoveStream(webrtc::MediaStreamInterface* stream);
+  virtual void OnDataChannel(webrtc::DataChannelInterface* channel) {}
+  virtual void OnRenegotiationNeeded() {}
+  virtual void OnIceChange() {}
+  virtual void OnIceCandidate(const webrtc::IceCandidateInterface* candidate);
+
+  //
+  // PeerConnectionClientObserver implementation.
+  //
+
+  virtual void OnSignedIn();
+
+  virtual void OnDisconnected();
+
+  virtual void OnPeerConnected(int id, const std::string& name);
+
+  virtual void OnPeerDisconnected(int id);
+
+  virtual void OnMessageFromPeer(int peer_id, const std::string& message);
+
+  virtual void OnMessageSent(int err);
+
+  virtual void OnServerConnectionFailure();
+
+  //
+  // MainWndCallback implementation.
+  //
+
+  virtual void StartLogin(const std::string& server, int port);
+
+  virtual void DisconnectFromServer();
+
+  virtual void ConnectToPeer(int peer_id);
+
+  virtual void DisconnectFromCurrentPeer();
+
+  virtual void UIThreadCallback(int msg_id, void* data);
+
+  // CreateSessionDescriptionObserver implementation.
+  virtual void OnSuccess(webrtc::SessionDescriptionInterface* desc);
+  virtual void OnFailure(const std::string& error);
+
+ protected:
+  // Send a message to the remote peer.
+  void SendMessage(const std::string& json_object);
+
+  int peer_id_;
+  bool loopback_;
+  rtc::scoped_refptr<webrtc::PeerConnectionInterface> peer_connection_;
+  rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface>
+      peer_connection_factory_;
+  PeerConnectionClient* client_;
+  MainWindow* main_wnd_;
+  std::deque<std::string*> pending_messages_;
+  std::map<std::string, rtc::scoped_refptr<webrtc::MediaStreamInterface> >
+      active_streams_;
+  std::string server_;
+};
+
+#endif  // TALK_EXAMPLES_PEERCONNECTION_CLIENT_CONDUCTOR_H_
diff --git a/examples/peerconnection/client/defaults.cc b/examples/peerconnection/client/defaults.cc
new file mode 100644
index 0000000..b686cd7
--- /dev/null
+++ b/examples/peerconnection/client/defaults.cc
@@ -0,0 +1,58 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/defaults.h"
+
+#include <stdlib.h>
+#include <string.h>
+
+#ifdef WIN32
+#include <winsock2.h>
+#else
+#include <unistd.h>
+#endif
+
+#include "webrtc/base/common.h"
+
+const char kAudioLabel[] = "audio_label";
+const char kVideoLabel[] = "video_label";
+const char kStreamLabel[] = "stream_label";
+const uint16 kDefaultServerPort = 8888;
+
+std::string GetEnvVarOrDefault(const char* env_var_name,
+                               const char* default_value) {
+  std::string value;
+  const char* env_var = getenv(env_var_name);
+  if (env_var)
+    value = env_var;
+
+  if (value.empty())
+    value = default_value;
+
+  return value;
+}
+
+std::string GetPeerConnectionString() {
+  return GetEnvVarOrDefault("WEBRTC_CONNECT", "stun:stun.l.google.com:19302");
+}
+
+std::string GetDefaultServerName() {
+  return GetEnvVarOrDefault("WEBRTC_SERVER", "localhost");
+}
+
+std::string GetPeerName() {
+  char computer_name[256];
+  if (gethostname(computer_name, ARRAY_SIZE(computer_name)) != 0)
+    strcpy(computer_name, "host");
+  std::string ret(GetEnvVarOrDefault("USERNAME", "user"));
+  ret += '@';
+  ret += computer_name;
+  return ret;
+}
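+
+// The helpers above make the example configurable through environment
+// variables; for example (illustrative values, and the binary name is
+// assumed since it depends on how the example is built):
+//   WEBRTC_CONNECT=stun:stun.l.google.com:19302 ./peerconnection_client
+//   WEBRTC_SERVER=my-signaling-host ./peerconnection_client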
diff --git a/examples/peerconnection/client/defaults.h b/examples/peerconnection/client/defaults.h
new file mode 100644
index 0000000..ab8276b
--- /dev/null
+++ b/examples/peerconnection/client/defaults.h
@@ -0,0 +1,30 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef PEERCONNECTION_SAMPLES_CLIENT_DEFAULTS_H_
+#define PEERCONNECTION_SAMPLES_CLIENT_DEFAULTS_H_
+#pragma once
+
+#include <string>
+
+#include "webrtc/base/basictypes.h"
+
+extern const char kAudioLabel[];
+extern const char kVideoLabel[];
+extern const char kStreamLabel[];
+extern const uint16 kDefaultServerPort;
+
+std::string GetEnvVarOrDefault(const char* env_var_name,
+                               const char* default_value);
+std::string GetPeerConnectionString();
+std::string GetDefaultServerName();
+std::string GetPeerName();
+
+#endif  // PEERCONNECTION_SAMPLES_CLIENT_DEFAULTS_H_
diff --git a/examples/peerconnection/client/flagdefs.h b/examples/peerconnection/client/flagdefs.h
new file mode 100644
index 0000000..00e134d
--- /dev/null
+++ b/examples/peerconnection/client/flagdefs.h
@@ -0,0 +1,33 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef TALK_EXAMPLES_PEERCONNECTION_CLIENT_FLAGDEFS_H_
+#define TALK_EXAMPLES_PEERCONNECTION_CLIENT_FLAGDEFS_H_
+#pragma once
+
+#include "webrtc/base/flags.h"
+
+extern const uint16 kDefaultServerPort;  // From defaults.[h|cc]
+
+// Define flags for the peerconnection_client testing tool, in a separate
+// header file so that they can be shared across the different main.cc's
+// for each platform.
+
+DEFINE_bool(help, false, "Prints this message");
+DEFINE_bool(autoconnect, false, "Connect to the server without user "
+                                "intervention.");
+DEFINE_string(server, "localhost", "The server to connect to.");
+DEFINE_int(port, kDefaultServerPort,
+           "The port on which the server is listening.");
+DEFINE_bool(autocall, false, "Call the first available other client on "
+  "the server without user intervention.  Note: this flag should only be set "
+  "to true on one of the two clients.");
+
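+// One plausible invocation (the binary name is assumed and may vary by build):
+//   peerconnection_client --server=localhost --port=8888 --autoconnect --autocall
+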
+#endif  // TALK_EXAMPLES_PEERCONNECTION_CLIENT_FLAGDEFS_H_
diff --git a/examples/peerconnection/client/linux/main.cc b/examples/peerconnection/client/linux/main.cc
new file mode 100644
index 0000000..cf88c36
--- /dev/null
+++ b/examples/peerconnection/client/linux/main.cc
@@ -0,0 +1,105 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include <gtk/gtk.h>
+
+#include "webrtc/examples/peerconnection/client/conductor.h"
+#include "webrtc/examples/peerconnection/client/flagdefs.h"
+#include "webrtc/examples/peerconnection/client/linux/main_wnd.h"
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+
+#include "webrtc/base/ssladapter.h"
+#include "webrtc/base/thread.h"
+
+class CustomSocketServer : public rtc::PhysicalSocketServer {
+ public:
+  CustomSocketServer(rtc::Thread* thread, GtkMainWnd* wnd)
+      : thread_(thread), wnd_(wnd), conductor_(NULL), client_(NULL) {}
+  virtual ~CustomSocketServer() {}
+
+  void set_client(PeerConnectionClient* client) { client_ = client; }
+  void set_conductor(Conductor* conductor) { conductor_ = conductor; }
+
+  // Override so that we can also pump the GTK message loop.
+  virtual bool Wait(int cms, bool process_io) {
+    // Pump GTK events.
+    // TODO: We really should move either the socket server or UI to a
+    // different thread.  Alternatively we could look at merging the two loops
+    // by implementing a dispatcher for the socket server and/or use
+    // g_main_context_set_poll_func.
+    while (gtk_events_pending())
+      gtk_main_iteration();
+
+    if (!wnd_->IsWindow() && !conductor_->connection_active() &&
+        client_ != NULL && !client_->is_connected()) {
+      thread_->Quit();
+    }
+    return rtc::PhysicalSocketServer::Wait(0/*cms == -1 ? 1 : cms*/,
+                                                 process_io);
+  }
+
+ protected:
+  rtc::Thread* thread_;
+  GtkMainWnd* wnd_;
+  Conductor* conductor_;
+  PeerConnectionClient* client_;
+};
+
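+// main() runs everything on a single thread: a GtkMainWnd for the UI, a
+// PeerConnectionClient that talks to the signaling server chosen by --server
+// and --port, and a reference-counted Conductor that owns the PeerConnection.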
+int main(int argc, char* argv[]) {
+  gtk_init(&argc, &argv);
+  g_type_init();
+  // g_thread_init API is deprecated since glib 2.31.0, see release note:
+  // http://mail.gnome.org/archives/gnome-announce-list/2011-October/msg00041.html
+#if !GLIB_CHECK_VERSION(2, 31, 0)
+    g_thread_init(NULL);
+#endif
+
+  rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true);
+  if (FLAG_help) {
+    rtc::FlagList::Print(NULL, false);
+    return 0;
+  }
+
+  // Abort if the user specifies a port that is outside the allowed
+  // range [1, 65535].
+  if ((FLAG_port < 1) || (FLAG_port > 65535)) {
+    printf("Error: %i is not a valid port.\n", FLAG_port);
+    return -1;
+  }
+
+  GtkMainWnd wnd(FLAG_server, FLAG_port, FLAG_autoconnect, FLAG_autocall);
+  wnd.Create();
+
+  rtc::AutoThread auto_thread;
+  rtc::Thread* thread = rtc::Thread::Current();
+  CustomSocketServer socket_server(thread, &wnd);
+  thread->set_socketserver(&socket_server);
+
+  rtc::InitializeSSL();
+  // Must be constructed after we set the socketserver.
+  PeerConnectionClient client;
+  rtc::scoped_refptr<Conductor> conductor(
+      new rtc::RefCountedObject<Conductor>(&client, &wnd));
+  socket_server.set_client(&client);
+  socket_server.set_conductor(conductor);
+
+  thread->Run();
+
+  // gtk_main();
+  wnd.Destroy();
+
+  thread->set_socketserver(NULL);
+  // TODO: Run the Gtk main loop to tear down the connection.
+  //while (gtk_events_pending()) {
+  //  gtk_main_iteration();
+  //}
+  rtc::CleanupSSL();
+  return 0;
+}
diff --git a/examples/peerconnection/client/linux/main_wnd.cc b/examples/peerconnection/client/linux/main_wnd.cc
new file mode 100644
index 0000000..02b6e32
--- /dev/null
+++ b/examples/peerconnection/client/linux/main_wnd.cc
@@ -0,0 +1,513 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/linux/main_wnd.h"
+
+#include <gdk/gdkkeysyms.h>
+#include <gtk/gtk.h>
+#include <stddef.h>
+
+#include "webrtc/examples/peerconnection/client/defaults.h"
+#include "webrtc/base/common.h"
+#include "webrtc/base/logging.h"
+#include "webrtc/base/stringutils.h"
+
+using rtc::sprintfn;
+
+namespace {
+
+//
+// Simple static functions that simply forward the callback to the
+// GtkMainWnd instance.
+//
+
+gboolean OnDestroyedCallback(GtkWidget* widget, GdkEvent* event,
+                             gpointer data) {
+  reinterpret_cast<GtkMainWnd*>(data)->OnDestroyed(widget, event);
+  return FALSE;
+}
+
+void OnClickedCallback(GtkWidget* widget, gpointer data) {
+  reinterpret_cast<GtkMainWnd*>(data)->OnClicked(widget);
+}
+
+gboolean SimulateButtonClick(gpointer button) {
+  g_signal_emit_by_name(button, "clicked");
+  return false;
+}
+
+gboolean OnKeyPressCallback(GtkWidget* widget, GdkEventKey* key,
+                            gpointer data) {
+  reinterpret_cast<GtkMainWnd*>(data)->OnKeyPress(widget, key);
+  return false;
+}
+
+void OnRowActivatedCallback(GtkTreeView* tree_view, GtkTreePath* path,
+                            GtkTreeViewColumn* column, gpointer data) {
+  reinterpret_cast<GtkMainWnd*>(data)->OnRowActivated(tree_view, path, column);
+}
+
+gboolean SimulateLastRowActivated(gpointer data) {
+  GtkTreeView* tree_view = reinterpret_cast<GtkTreeView*>(data);
+  GtkTreeModel* model = gtk_tree_view_get_model(tree_view);
+
+  // "if iter is NULL, then the number of toplevel nodes is returned."
+  int rows = gtk_tree_model_iter_n_children(model, NULL);
+  GtkTreePath* lastpath = gtk_tree_path_new_from_indices(rows - 1, -1);
+
+  // Select the last item in the list
+  GtkTreeSelection* selection = gtk_tree_view_get_selection(tree_view);
+  gtk_tree_selection_select_path(selection, lastpath);
+
+  // Our TreeView only has one column, so it is column 0.
+  GtkTreeViewColumn* column = gtk_tree_view_get_column(tree_view, 0);
+
+  gtk_tree_view_row_activated(tree_view, lastpath, column);
+
+  gtk_tree_path_free(lastpath);
+  return false;
+}
+
+// Creates a tree view that we use to display the list of peers.
+void InitializeList(GtkWidget* list) {
+  GtkCellRenderer* renderer = gtk_cell_renderer_text_new();
+  GtkTreeViewColumn* column = gtk_tree_view_column_new_with_attributes(
+      "List Items", renderer, "text", 0, NULL);
+  gtk_tree_view_append_column(GTK_TREE_VIEW(list), column);
+  GtkListStore* store = gtk_list_store_new(2, G_TYPE_STRING, G_TYPE_INT);
+  gtk_tree_view_set_model(GTK_TREE_VIEW(list), GTK_TREE_MODEL(store));
+  g_object_unref(store);
+}
+
+// Adds an entry to a tree view.
+void AddToList(GtkWidget* list, const gchar* str, int value) {
+  GtkListStore* store = GTK_LIST_STORE(
+      gtk_tree_view_get_model(GTK_TREE_VIEW(list)));
+
+  GtkTreeIter iter;
+  gtk_list_store_append(store, &iter);
+  gtk_list_store_set(store, &iter, 0, str, 1, value, -1);
+}
+
+struct UIThreadCallbackData {
+  explicit UIThreadCallbackData(MainWndCallback* cb, int id, void* d)
+      : callback(cb), msg_id(id), data(d) {}
+  MainWndCallback* callback;
+  int msg_id;
+  void* data;
+};
+
+gboolean HandleUIThreadCallback(gpointer data) {
+  UIThreadCallbackData* cb_data = reinterpret_cast<UIThreadCallbackData*>(data);
+  cb_data->callback->UIThreadCallback(cb_data->msg_id, cb_data->data);
+  delete cb_data;
+  return false;
+}
+
+gboolean Redraw(gpointer data) {
+  GtkMainWnd* wnd = reinterpret_cast<GtkMainWnd*>(data);
+  wnd->OnRedraw();
+  return false;
+}
+}  // namespace
+
+//
+// GtkMainWnd implementation.
+//
+
+GtkMainWnd::GtkMainWnd(const char* server, int port, bool autoconnect,
+                       bool autocall)
+    : window_(NULL), draw_area_(NULL), vbox_(NULL), server_edit_(NULL),
+      port_edit_(NULL), peer_list_(NULL), callback_(NULL),
+      server_(server), autoconnect_(autoconnect), autocall_(autocall) {
+  char buffer[10];
+  sprintfn(buffer, sizeof(buffer), "%i", port);
+  port_ = buffer;
+}
+
+GtkMainWnd::~GtkMainWnd() {
+  ASSERT(!IsWindow());
+}
+
+void GtkMainWnd::RegisterObserver(MainWndCallback* callback) {
+  callback_ = callback;
+}
+
+bool GtkMainWnd::IsWindow() {
+  return window_ != NULL && GTK_IS_WINDOW(window_);
+}
+
+void GtkMainWnd::MessageBox(const char* caption, const char* text,
+                            bool is_error) {
+  GtkWidget* dialog = gtk_message_dialog_new(GTK_WINDOW(window_),
+      GTK_DIALOG_DESTROY_WITH_PARENT,
+      is_error ? GTK_MESSAGE_ERROR : GTK_MESSAGE_INFO,
+      GTK_BUTTONS_CLOSE, "%s", text);
+  gtk_window_set_title(GTK_WINDOW(dialog), caption);
+  gtk_dialog_run(GTK_DIALOG(dialog));
+  gtk_widget_destroy(dialog);
+}
+
+MainWindow::UI GtkMainWnd::current_ui() {
+  if (vbox_)
+    return CONNECT_TO_SERVER;
+
+  if (peer_list_)
+    return LIST_PEERS;
+
+  return STREAMING;
+}
+
+
+void GtkMainWnd::StartLocalRenderer(webrtc::VideoTrackInterface* local_video) {
+  local_renderer_.reset(new VideoRenderer(this, local_video));
+}
+
+void GtkMainWnd::StopLocalRenderer() {
+  local_renderer_.reset();
+}
+
+void GtkMainWnd::StartRemoteRenderer(webrtc::VideoTrackInterface* remote_video) {
+  remote_renderer_.reset(new VideoRenderer(this, remote_video));
+}
+
+void GtkMainWnd::StopRemoteRenderer() {
+  remote_renderer_.reset();
+}
+
+void GtkMainWnd::QueueUIThreadCallback(int msg_id, void* data) {
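+  // Schedule the callback on the GTK main loop; HandleUIThreadCallback takes
+  // ownership of the UIThreadCallbackData and deletes it after running.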
+  g_idle_add(HandleUIThreadCallback,
+             new UIThreadCallbackData(callback_, msg_id, data));
+}
+
+bool GtkMainWnd::Create() {
+  ASSERT(window_ == NULL);
+
+  window_ = gtk_window_new(GTK_WINDOW_TOPLEVEL);
+  if (window_) {
+    gtk_window_set_position(GTK_WINDOW(window_), GTK_WIN_POS_CENTER);
+    gtk_window_set_default_size(GTK_WINDOW(window_), 640, 480);
+    gtk_window_set_title(GTK_WINDOW(window_), "PeerConnection client");
+    g_signal_connect(G_OBJECT(window_), "delete-event",
+                     G_CALLBACK(&OnDestroyedCallback), this);
+    g_signal_connect(window_, "key-press-event", G_CALLBACK(OnKeyPressCallback),
+                     this);
+
+    SwitchToConnectUI();
+  }
+
+  return window_ != NULL;
+}
+
+bool GtkMainWnd::Destroy() {
+  if (!IsWindow())
+    return false;
+
+  gtk_widget_destroy(window_);
+  window_ = NULL;
+
+  return true;
+}
+
+void GtkMainWnd::SwitchToConnectUI() {
+  LOG(INFO) << __FUNCTION__;
+
+  ASSERT(IsWindow());
+  ASSERT(vbox_ == NULL);
+
+  gtk_container_set_border_width(GTK_CONTAINER(window_), 10);
+
+  if (peer_list_) {
+    gtk_widget_destroy(peer_list_);
+    peer_list_ = NULL;
+  }
+
+  vbox_ = gtk_vbox_new(FALSE, 5);
+  GtkWidget* valign = gtk_alignment_new(0, 1, 0, 0);
+  gtk_container_add(GTK_CONTAINER(vbox_), valign);
+  gtk_container_add(GTK_CONTAINER(window_), vbox_);
+
+  GtkWidget* hbox = gtk_hbox_new(FALSE, 5);
+
+  GtkWidget* label = gtk_label_new("Server");
+  gtk_container_add(GTK_CONTAINER(hbox), label);
+
+  server_edit_ = gtk_entry_new();
+  gtk_entry_set_text(GTK_ENTRY(server_edit_), server_.c_str());
+  gtk_widget_set_size_request(server_edit_, 400, 30);
+  gtk_container_add(GTK_CONTAINER(hbox), server_edit_);
+
+  port_edit_ = gtk_entry_new();
+  gtk_entry_set_text(GTK_ENTRY(port_edit_), port_.c_str());
+  gtk_widget_set_size_request(port_edit_, 70, 30);
+  gtk_container_add(GTK_CONTAINER(hbox), port_edit_);
+
+  GtkWidget* button = gtk_button_new_with_label("Connect");
+  gtk_widget_set_size_request(button, 70, 30);
+  g_signal_connect(button, "clicked", G_CALLBACK(OnClickedCallback), this);
+  gtk_container_add(GTK_CONTAINER(hbox), button);
+
+  GtkWidget* halign = gtk_alignment_new(1, 0, 0, 0);
+  gtk_container_add(GTK_CONTAINER(halign), hbox);
+  gtk_box_pack_start(GTK_BOX(vbox_), halign, FALSE, FALSE, 0);
+
+  gtk_widget_show_all(window_);
+
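+  // If autoconnect was requested, simulate a click on the Connect button once
+  // the GTK main loop goes idle.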
+  if (autoconnect_)
+    g_idle_add(SimulateButtonClick, button);
+}
+
+void GtkMainWnd::SwitchToPeerList(const Peers& peers) {
+  LOG(INFO) << __FUNCTION__;
+
+  if (!peer_list_) {
+    gtk_container_set_border_width(GTK_CONTAINER(window_), 0);
+    if (vbox_) {
+      gtk_widget_destroy(vbox_);
+      vbox_ = NULL;
+      server_edit_ = NULL;
+      port_edit_ = NULL;
+    } else if (draw_area_) {
+      gtk_widget_destroy(draw_area_);
+      draw_area_ = NULL;
+      draw_buffer_.reset();
+    }
+
+    peer_list_ = gtk_tree_view_new();
+    g_signal_connect(peer_list_, "row-activated",
+                     G_CALLBACK(OnRowActivatedCallback), this);
+    gtk_tree_view_set_headers_visible(GTK_TREE_VIEW(peer_list_), FALSE);
+    InitializeList(peer_list_);
+    gtk_container_add(GTK_CONTAINER(window_), peer_list_);
+    gtk_widget_show_all(window_);
+  } else {
+    GtkListStore* store =
+        GTK_LIST_STORE(gtk_tree_view_get_model(GTK_TREE_VIEW(peer_list_)));
+    gtk_list_store_clear(store);
+  }
+
+  AddToList(peer_list_, "List of currently connected peers:", -1);
+  for (Peers::const_iterator i = peers.begin(); i != peers.end(); ++i)
+    AddToList(peer_list_, i->second.c_str(), i->first);
+
+  if (autocall_ && peers.begin() != peers.end())
+    g_idle_add(SimulateLastRowActivated, peer_list_);
+}
+
+void GtkMainWnd::SwitchToStreamingUI() {
+  LOG(INFO) << __FUNCTION__;
+
+  ASSERT(draw_area_ == NULL);
+
+  gtk_container_set_border_width(GTK_CONTAINER(window_), 0);
+  if (peer_list_) {
+    gtk_widget_destroy(peer_list_);
+    peer_list_ = NULL;
+  }
+
+  draw_area_ = gtk_drawing_area_new();
+  gtk_container_add(GTK_CONTAINER(window_), draw_area_);
+
+  gtk_widget_show_all(window_);
+}
+
+void GtkMainWnd::OnDestroyed(GtkWidget* widget, GdkEvent* event) {
+  callback_->Close();
+  window_ = NULL;
+  draw_area_ = NULL;
+  vbox_ = NULL;
+  server_edit_ = NULL;
+  port_edit_ = NULL;
+  peer_list_ = NULL;
+}
+
+void GtkMainWnd::OnClicked(GtkWidget* widget) {
+  // Make the connect button insensitive so that it cannot be clicked again;
+  // the connection logic auto-retries, so repeated clicks are unnecessary.
+  gtk_widget_set_sensitive(widget, false);
+  server_ = gtk_entry_get_text(GTK_ENTRY(server_edit_));
+  port_ = gtk_entry_get_text(GTK_ENTRY(port_edit_));
+  int port = port_.length() ? atoi(port_.c_str()) : 0;
+  callback_->StartLogin(server_, port);
+}
+
+void GtkMainWnd::OnKeyPress(GtkWidget* widget, GdkEventKey* key) {
+  if (key->type == GDK_KEY_PRESS) {
+    switch (key->keyval) {
+     case GDK_Escape:
+       if (draw_area_) {
+         callback_->DisconnectFromCurrentPeer();
+       } else if (peer_list_) {
+         callback_->DisconnectFromServer();
+       }
+       break;
+
+     case GDK_KP_Enter:
+     case GDK_Return:
+       if (vbox_) {
+         OnClicked(NULL);
+       } else if (peer_list_) {
+         // OnRowActivated will be called automatically when the user
+         // presses enter.
+       }
+       break;
+
+     default:
+       break;
+    }
+  }
+}
+
+void GtkMainWnd::OnRowActivated(GtkTreeView* tree_view, GtkTreePath* path,
+                                GtkTreeViewColumn* column) {
+  ASSERT(peer_list_ != NULL);
+  GtkTreeIter iter;
+  GtkTreeModel* model;
+  GtkTreeSelection* selection =
+      gtk_tree_view_get_selection(GTK_TREE_VIEW(tree_view));
+  if (gtk_tree_selection_get_selected(selection, &model, &iter)) {
+     char* text;
+     int id = -1;
+     gtk_tree_model_get(model, &iter, 0, &text, 1, &id,  -1);
+     if (id != -1)
+       callback_->ConnectToPeer(id);
+     g_free(text);
+  }
+}
+
+void GtkMainWnd::OnRedraw() {
+  gdk_threads_enter();
+
+  VideoRenderer* remote_renderer = remote_renderer_.get();
+  if (remote_renderer && remote_renderer->image() != NULL &&
+      draw_area_ != NULL) {
+    int width = remote_renderer->width();
+    int height = remote_renderer->height();
+
+    if (!draw_buffer_.get()) {
+      draw_buffer_size_ = (width * height * 4) * 4;
+      draw_buffer_.reset(new uint8[draw_buffer_size_]);
+      gtk_widget_set_size_request(draw_area_, width * 2, height * 2);
+    }
+
+    const uint32* image = reinterpret_cast<const uint32*>(
+        remote_renderer->image());
+    uint32* scaled = reinterpret_cast<uint32*>(draw_buffer_.get());
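+    // Upscale the remote frame 2x with nearest-neighbor filtering: duplicate
+    // each pixel horizontally, then copy the finished row once to duplicate
+    // it vertically.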
+    for (int r = 0; r < height; ++r) {
+      for (int c = 0; c < width; ++c) {
+        int x = c * 2;
+        scaled[x] = scaled[x + 1] = image[c];
+      }
+
+      uint32* prev_line = scaled;
+      scaled += width * 2;
+      memcpy(scaled, prev_line, (width * 2) * 4);
+
+      image += width;
+      scaled += width * 2;
+    }
+
+    VideoRenderer* local_renderer = local_renderer_.get();
+    if (local_renderer && local_renderer->image()) {
+      image = reinterpret_cast<const uint32*>(local_renderer->image());
+      scaled = reinterpret_cast<uint32*>(draw_buffer_.get());
+      // Position the local preview on the right side.
+      scaled += (width * 2) - (local_renderer->width() / 2);
+      // right margin...
+      scaled -= 10;
+      // ... towards the bottom.
+      scaled += (height * width * 4) -
+                ((local_renderer->height() / 2) *
+                 (local_renderer->width() / 2) * 4);
+      // bottom margin...
+      scaled -= (width * 2) * 5;
+      for (int r = 0; r < local_renderer->height(); r += 2) {
+        for (int c = 0; c < local_renderer->width(); c += 2) {
+          scaled[c / 2] = image[c + r * local_renderer->width()];
+        }
+        scaled += width * 2;
+      }
+    }
+
+    gdk_draw_rgb_32_image(draw_area_->window,
+                          draw_area_->style->fg_gc[GTK_STATE_NORMAL],
+                          0,
+                          0,
+                          width * 2,
+                          height * 2,
+                          GDK_RGB_DITHER_MAX,
+                          draw_buffer_.get(),
+                          (width * 2) * 4);
+  }
+
+  gdk_threads_leave();
+}
+
+GtkMainWnd::VideoRenderer::VideoRenderer(
+    GtkMainWnd* main_wnd,
+    webrtc::VideoTrackInterface* track_to_render)
+    : width_(0),
+      height_(0),
+      main_wnd_(main_wnd),
+      rendered_track_(track_to_render) {
+  rendered_track_->AddRenderer(this);
+}
+
+GtkMainWnd::VideoRenderer::~VideoRenderer() {
+  rendered_track_->RemoveRenderer(this);
+}
+
+void GtkMainWnd::VideoRenderer::SetSize(int width, int height) {
+  gdk_threads_enter();
+
+  if (width_ == width && height_ == height) {
+    gdk_threads_leave();
+    return;
+  }
+
+  width_ = width;
+  height_ = height;
+  image_.reset(new uint8[width * height * 4]);
+  gdk_threads_leave();
+}
+
+void GtkMainWnd::VideoRenderer::RenderFrame(
+    const cricket::VideoFrame* video_frame) {
+  gdk_threads_enter();
+
+  const cricket::VideoFrame* frame = video_frame->GetCopyWithRotationApplied();
+
+  SetSize(static_cast<int>(frame->GetWidth()),
+          static_cast<int>(frame->GetHeight()));
+
+  int size = width_ * height_ * 4;
+  // TODO: Convert directly to RGBA
+  frame->ConvertToRgbBuffer(cricket::FOURCC_ARGB,
+                            image_.get(),
+                            size,
+                            width_ * 4);
+  // Convert the B,G,R,A frame to R,G,B,A, which is accepted by GTK.
+  // The 'A' is just padding for GTK, so we can use it as temporary storage.
+  uint8* pix = image_.get();
+  uint8* end = image_.get() + size;
+  while (pix < end) {
+    pix[3] = pix[0];  // Save B to A.
+    pix[0] = pix[2];  // Set Red.
+    pix[2] = pix[3];  // Set Blue.
+    pix[3] = 0xFF;    // Fixed Alpha.
+    pix += 4;
+  }
+
+  gdk_threads_leave();
+
+  g_idle_add(Redraw, main_wnd_);
+}
+
+
diff --git a/examples/peerconnection/client/linux/main_wnd.h b/examples/peerconnection/client/linux/main_wnd.h
new file mode 100644
index 0000000..cfb2376
--- /dev/null
+++ b/examples/peerconnection/client/linux/main_wnd.h
@@ -0,0 +1,120 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef PEERCONNECTION_SAMPLES_CLIENT_LINUX_MAIN_WND_H_
+#define PEERCONNECTION_SAMPLES_CLIENT_LINUX_MAIN_WND_H_
+
+#include "webrtc/examples/peerconnection/client/main_wnd.h"
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+
+// Forward declarations.
+typedef struct _GtkWidget GtkWidget;
+typedef union _GdkEvent GdkEvent;
+typedef struct _GdkEventKey GdkEventKey;
+typedef struct _GtkTreeView GtkTreeView;
+typedef struct _GtkTreePath GtkTreePath;
+typedef struct _GtkTreeViewColumn GtkTreeViewColumn;
+
+// Implements the main UI of the peer connection client.
+// This is functionally equivalent to the MainWnd class in the Windows
+// implementation.
+class GtkMainWnd : public MainWindow {
+ public:
+  GtkMainWnd(const char* server, int port, bool autoconnect, bool autocall);
+  ~GtkMainWnd();
+
+  virtual void RegisterObserver(MainWndCallback* callback);
+  virtual bool IsWindow();
+  virtual void SwitchToConnectUI();
+  virtual void SwitchToPeerList(const Peers& peers);
+  virtual void SwitchToStreamingUI();
+  virtual void MessageBox(const char* caption, const char* text,
+                          bool is_error);
+  virtual MainWindow::UI current_ui();
+  virtual void StartLocalRenderer(webrtc::VideoTrackInterface* local_video);
+  virtual void StopLocalRenderer();
+  virtual void StartRemoteRenderer(webrtc::VideoTrackInterface* remote_video);
+  virtual void StopRemoteRenderer();
+
+  virtual void QueueUIThreadCallback(int msg_id, void* data);
+
+  // Creates and shows the main window with the |Connect UI| enabled.
+  bool Create();
+
+  // Destroys the window.  When the window is destroyed, it ends the
+  // main message loop.
+  bool Destroy();
+
+  // Callback for when the main window is destroyed.
+  void OnDestroyed(GtkWidget* widget, GdkEvent* event);
+
+  // Callback for when the user clicks the "Connect" button.
+  void OnClicked(GtkWidget* widget);
+
+  // Callback for keystrokes.  Used to capture Esc and Return.
+  void OnKeyPress(GtkWidget* widget, GdkEventKey* key);
+
+  // Callback for when the user double-clicks a peer in order to initiate a
+  // connection.
+  void OnRowActivated(GtkTreeView* tree_view, GtkTreePath* path,
+                      GtkTreeViewColumn* column);
+
+  void OnRedraw();
+
+ protected:
+  class VideoRenderer : public webrtc::VideoRendererInterface {
+   public:
+    VideoRenderer(GtkMainWnd* main_wnd,
+                  webrtc::VideoTrackInterface* track_to_render);
+    virtual ~VideoRenderer();
+
+    // VideoRendererInterface implementation
+    virtual void SetSize(int width, int height);
+    virtual void RenderFrame(const cricket::VideoFrame* frame);
+
+    const uint8* image() const {
+      return image_.get();
+    }
+
+    int width() const {
+      return width_;
+    }
+
+    int height() const {
+      return height_;
+    }
+
+   protected:
+    rtc::scoped_ptr<uint8[]> image_;
+    int width_;
+    int height_;
+    GtkMainWnd* main_wnd_;
+    rtc::scoped_refptr<webrtc::VideoTrackInterface> rendered_track_;
+  };
+
+ protected:
+  GtkWidget* window_;  // Our main window.
+  GtkWidget* draw_area_;  // The drawing surface for rendering video streams.
+  GtkWidget* vbox_;  // Container for the Connect UI.
+  GtkWidget* server_edit_;
+  GtkWidget* port_edit_;
+  GtkWidget* peer_list_;  // The list of peers.
+  MainWndCallback* callback_;
+  std::string server_;
+  std::string port_;
+  bool autoconnect_;
+  bool autocall_;
+  rtc::scoped_ptr<VideoRenderer> local_renderer_;
+  rtc::scoped_ptr<VideoRenderer> remote_renderer_;
+  rtc::scoped_ptr<uint8> draw_buffer_;
+  int draw_buffer_size_;
+};
+
+#endif  // PEERCONNECTION_SAMPLES_CLIENT_LINUX_MAIN_WND_H_
diff --git a/examples/peerconnection/client/main.cc b/examples/peerconnection/client/main.cc
new file mode 100644
index 0000000..9aae684
--- /dev/null
+++ b/examples/peerconnection/client/main.cc
@@ -0,0 +1,76 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/conductor.h"
+#include "webrtc/examples/peerconnection/client/flagdefs.h"
+#include "webrtc/examples/peerconnection/client/main_wnd.h"
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+#include "webrtc/base/ssladapter.h"
+#include "webrtc/base/win32socketinit.h"
+#include "webrtc/base/win32socketserver.h"
+
+
+int PASCAL wWinMain(HINSTANCE instance, HINSTANCE prev_instance,
+                    wchar_t* cmd_line, int cmd_show) {
+  rtc::EnsureWinsockInit();
+  rtc::Win32Thread w32_thread;
+  rtc::ThreadManager::Instance()->SetCurrentThread(&w32_thread);
+
+  rtc::WindowsCommandLineArguments win_args;
+  int argc = win_args.argc();
+  char **argv = win_args.argv();
+
+  rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true);
+  if (FLAG_help) {
+    rtc::FlagList::Print(NULL, false);
+    return 0;
+  }
+
+  // Abort if the user specifies a port that is outside the allowed
+  // range [1, 65535].
+  if ((FLAG_port < 1) || (FLAG_port > 65535)) {
+    printf("Error: %i is not a valid port.\n", FLAG_port);
+    return -1;
+  }
+
+  MainWnd wnd(FLAG_server, FLAG_port, FLAG_autoconnect, FLAG_autocall);
+  if (!wnd.Create()) {
+    ASSERT(false);
+    return -1;
+  }
+
+  rtc::InitializeSSL();
+  PeerConnectionClient client;
+  rtc::scoped_refptr<Conductor> conductor(
+        new rtc::RefCountedObject<Conductor>(&client, &wnd));
+
+  // Main loop.
+  MSG msg;
+  BOOL gm;
+  while ((gm = ::GetMessage(&msg, NULL, 0, 0)) != 0 && gm != -1) {
+    if (!wnd.PreTranslateMessage(&msg)) {
+      ::TranslateMessage(&msg);
+      ::DispatchMessage(&msg);
+    }
+  }
+
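+  // Keep pumping messages while the conductor tears down the connection so
+  // that any queued UI thread callbacks are still delivered.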
+  if (conductor->connection_active() || client.is_connected()) {
+    while ((conductor->connection_active() || client.is_connected()) &&
+           (gm = ::GetMessage(&msg, NULL, 0, 0)) != 0 && gm != -1) {
+      if (!wnd.PreTranslateMessage(&msg)) {
+        ::TranslateMessage(&msg);
+        ::DispatchMessage(&msg);
+      }
+    }
+  }
+
+  rtc::CleanupSSL();
+  return 0;
+}
diff --git a/examples/peerconnection/client/main_wnd.cc b/examples/peerconnection/client/main_wnd.cc
new file mode 100644
index 0000000..fa356ff
--- /dev/null
+++ b/examples/peerconnection/client/main_wnd.cc
@@ -0,0 +1,622 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/main_wnd.h"
+
+#include <math.h>
+
+#include "webrtc/examples/peerconnection/client/defaults.h"
+#include "webrtc/base/common.h"
+#include "webrtc/base/logging.h"
+
+ATOM MainWnd::wnd_class_ = 0;
+const wchar_t MainWnd::kClassName[] = L"WebRTC_MainWnd";
+
+using rtc::sprintfn;
+
+namespace {
+
+const char kConnecting[] = "Connecting... ";
+const char kNoVideoStreams[] = "(no video streams either way)";
+const char kNoIncomingStream[] = "(no incoming video)";
+
+void CalculateWindowSizeForText(HWND wnd, const wchar_t* text,
+                                size_t* width, size_t* height) {
+  HDC dc = ::GetDC(wnd);
+  RECT text_rc = {0};
+  ::DrawText(dc, text, -1, &text_rc, DT_CALCRECT | DT_SINGLELINE);
+  ::ReleaseDC(wnd, dc);
+  RECT client, window;
+  ::GetClientRect(wnd, &client);
+  ::GetWindowRect(wnd, &window);
+
+  *width = text_rc.right - text_rc.left;
+  *width += (window.right - window.left) -
+            (client.right - client.left);
+  *height = text_rc.bottom - text_rc.top;
+  *height += (window.bottom - window.top) -
+             (client.bottom - client.top);
+}
+
+HFONT GetDefaultFont() {
+  static HFONT font = reinterpret_cast<HFONT>(GetStockObject(DEFAULT_GUI_FONT));
+  return font;
+}
+
+std::string GetWindowText(HWND wnd) {
+  char text[MAX_PATH] = {0};
+  ::GetWindowTextA(wnd, &text[0], ARRAYSIZE(text));
+  return text;
+}
+
+void AddListBoxItem(HWND listbox, const std::string& str, LPARAM item_data) {
+  LRESULT index = ::SendMessageA(listbox, LB_ADDSTRING, 0,
+      reinterpret_cast<LPARAM>(str.c_str()));
+  ::SendMessageA(listbox, LB_SETITEMDATA, index, item_data);
+}
+
+}  // namespace
+
+MainWnd::MainWnd(const char* server, int port, bool auto_connect,
+                 bool auto_call)
+  : ui_(CONNECT_TO_SERVER), wnd_(NULL), edit1_(NULL), edit2_(NULL),
+    label1_(NULL), label2_(NULL), button_(NULL), listbox_(NULL),
+    destroyed_(false), callback_(NULL), nested_msg_(NULL),
+    server_(server), auto_connect_(auto_connect), auto_call_(auto_call) {
+  char buffer[10] = {0};
+  sprintfn(buffer, sizeof(buffer), "%i", port);
+  port_ = buffer;
+}
+
+MainWnd::~MainWnd() {
+  ASSERT(!IsWindow());
+}
+
+bool MainWnd::Create() {
+  ASSERT(wnd_ == NULL);
+  if (!RegisterWindowClass())
+    return false;
+
+  ui_thread_id_ = ::GetCurrentThreadId();
+  wnd_ = ::CreateWindowExW(WS_EX_OVERLAPPEDWINDOW, kClassName, L"WebRTC",
+      WS_OVERLAPPEDWINDOW | WS_VISIBLE | WS_CLIPCHILDREN,
+      CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT,
+      NULL, NULL, GetModuleHandle(NULL), this);
+
+  ::SendMessage(wnd_, WM_SETFONT, reinterpret_cast<WPARAM>(GetDefaultFont()),
+                TRUE);
+
+  CreateChildWindows();
+  SwitchToConnectUI();
+
+  return wnd_ != NULL;
+}
+
+bool MainWnd::Destroy() {
+  BOOL ret = FALSE;
+  if (IsWindow()) {
+    ret = ::DestroyWindow(wnd_);
+  }
+
+  return ret != FALSE;
+}
+
+void MainWnd::RegisterObserver(MainWndCallback* callback) {
+  callback_ = callback;
+}
+
+bool MainWnd::IsWindow() {
+  return wnd_ && ::IsWindow(wnd_) != FALSE;
+}
+
+bool MainWnd::PreTranslateMessage(MSG* msg) {
+  bool ret = false;
+  if (msg->message == WM_CHAR) {
+    if (msg->wParam == VK_TAB) {
+      HandleTabbing();
+      ret = true;
+    } else if (msg->wParam == VK_RETURN) {
+      OnDefaultAction();
+      ret = true;
+    } else if (msg->wParam == VK_ESCAPE) {
+      if (callback_) {
+        if (ui_ == STREAMING) {
+          callback_->DisconnectFromCurrentPeer();
+        } else {
+          callback_->DisconnectFromServer();
+        }
+      }
+    }
+  } else if (msg->hwnd == NULL && msg->message == UI_THREAD_CALLBACK) {
+    callback_->UIThreadCallback(static_cast<int>(msg->wParam),
+                                reinterpret_cast<void*>(msg->lParam));
+    ret = true;
+  }
+  return ret;
+}
+
+void MainWnd::SwitchToConnectUI() {
+  ASSERT(IsWindow());
+  LayoutPeerListUI(false);
+  ui_ = CONNECT_TO_SERVER;
+  LayoutConnectUI(true);
+  ::SetFocus(edit1_);
+
+  if (auto_connect_)
+    ::PostMessage(button_, BM_CLICK, 0, 0);
+}
+
+void MainWnd::SwitchToPeerList(const Peers& peers) {
+  LayoutConnectUI(false);
+
+  ::SendMessage(listbox_, LB_RESETCONTENT, 0, 0);
+
+  AddListBoxItem(listbox_, "List of currently connected peers:", -1);
+  Peers::const_iterator i = peers.begin();
+  for (; i != peers.end(); ++i)
+    AddListBoxItem(listbox_, i->second.c_str(), i->first);
+
+  ui_ = LIST_PEERS;
+  LayoutPeerListUI(true);
+  ::SetFocus(listbox_);
+
+  if (auto_call_ && peers.begin() != peers.end()) {
+    // Get the number of items in the list
+    LRESULT count = ::SendMessage(listbox_, LB_GETCOUNT, 0, 0);
+    if (count != LB_ERR) {
+      // Select the last item in the list
+      LRESULT selection = ::SendMessage(listbox_, LB_SETCURSEL , count - 1, 0);
+      if (selection != LB_ERR)
+        ::PostMessage(wnd_, WM_COMMAND, MAKEWPARAM(GetDlgCtrlID(listbox_),
+                                                   LBN_DBLCLK),
+                      reinterpret_cast<LPARAM>(listbox_));
+    }
+  }
+}
+
+void MainWnd::SwitchToStreamingUI() {
+  LayoutConnectUI(false);
+  LayoutPeerListUI(false);
+  ui_ = STREAMING;
+}
+
+void MainWnd::MessageBox(const char* caption, const char* text, bool is_error) {
+  DWORD flags = MB_OK;
+  if (is_error)
+    flags |= MB_ICONERROR;
+
+  ::MessageBoxA(handle(), text, caption, flags);
+}
+
+
+void MainWnd::StartLocalRenderer(webrtc::VideoTrackInterface* local_video) {
+  local_renderer_.reset(new VideoRenderer(handle(), 1, 1, local_video));
+}
+
+void MainWnd::StopLocalRenderer() {
+  local_renderer_.reset();
+}
+
+void MainWnd::StartRemoteRenderer(webrtc::VideoTrackInterface* remote_video) {
+  remote_renderer_.reset(new VideoRenderer(handle(), 1, 1, remote_video));
+}
+
+void MainWnd::StopRemoteRenderer() {
+  remote_renderer_.reset();
+}
+
+void MainWnd::QueueUIThreadCallback(int msg_id, void* data) {
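+  // Post the callback to the UI thread's message queue; PreTranslateMessage()
+  // picks up UI_THREAD_CALLBACK and dispatches it to the registered callback.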
+  ::PostThreadMessage(ui_thread_id_, UI_THREAD_CALLBACK,
+      static_cast<WPARAM>(msg_id), reinterpret_cast<LPARAM>(data));
+}
+
+void MainWnd::OnPaint() {
+  PAINTSTRUCT ps;
+  ::BeginPaint(handle(), &ps);
+
+  RECT rc;
+  ::GetClientRect(handle(), &rc);
+
+  VideoRenderer* local_renderer = local_renderer_.get();
+  VideoRenderer* remote_renderer = remote_renderer_.get();
+  if (ui_ == STREAMING && remote_renderer && local_renderer) {
+    AutoLock<VideoRenderer> local_lock(local_renderer);
+    AutoLock<VideoRenderer> remote_lock(remote_renderer);
+
+    const BITMAPINFO& bmi = remote_renderer->bmi();
+    int height = abs(bmi.bmiHeader.biHeight);
+    int width = bmi.bmiHeader.biWidth;
+
+    const uint8* image = remote_renderer->image();
+    if (image != NULL) {
+      HDC dc_mem = ::CreateCompatibleDC(ps.hdc);
+      ::SetStretchBltMode(dc_mem, HALFTONE);
+
+      // Set the map mode so that the ratio will be maintained for us.
+      HDC all_dc[] = { ps.hdc, dc_mem };
+      for (int i = 0; i < ARRAY_SIZE(all_dc); ++i) {
+        SetMapMode(all_dc[i], MM_ISOTROPIC);
+        SetWindowExtEx(all_dc[i], width, height, NULL);
+        SetViewportExtEx(all_dc[i], rc.right, rc.bottom, NULL);
+      }
+
+      HBITMAP bmp_mem = ::CreateCompatibleBitmap(ps.hdc, rc.right, rc.bottom);
+      HGDIOBJ bmp_old = ::SelectObject(dc_mem, bmp_mem);
+
+      POINT logical_area = { rc.right, rc.bottom };
+      DPtoLP(ps.hdc, &logical_area, 1);
+
+      HBRUSH brush = ::CreateSolidBrush(RGB(0, 0, 0));
+      RECT logical_rect = {0, 0, logical_area.x, logical_area.y };
+      ::FillRect(dc_mem, &logical_rect, brush);
+      ::DeleteObject(brush);
+
+      int x = (logical_area.x / 2) - (width / 2);
+      int y = (logical_area.y / 2) - (height / 2);
+
+      StretchDIBits(dc_mem, x, y, width, height,
+                    0, 0, width, height, image, &bmi, DIB_RGB_COLORS, SRCCOPY);
+
+      if ((rc.right - rc.left) > 200 && (rc.bottom - rc.top) > 200) {
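+        // Overlay the local preview as a quarter-size thumbnail in the
+        // bottom-right corner.  biHeight is negative (top-down DIB), hence
+        // the sign flip on the source height below.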
+        const BITMAPINFO& bmi = local_renderer->bmi();
+        image = local_renderer->image();
+        int thumb_width = bmi.bmiHeader.biWidth / 4;
+        int thumb_height = abs(bmi.bmiHeader.biHeight) / 4;
+        StretchDIBits(dc_mem,
+            logical_area.x - thumb_width - 10,
+            logical_area.y - thumb_height - 10,
+            thumb_width, thumb_height,
+            0, 0, bmi.bmiHeader.biWidth, -bmi.bmiHeader.biHeight,
+            image, &bmi, DIB_RGB_COLORS, SRCCOPY);
+      }
+
+      BitBlt(ps.hdc, 0, 0, logical_area.x, logical_area.y,
+             dc_mem, 0, 0, SRCCOPY);
+
+      // Cleanup.
+      ::SelectObject(dc_mem, bmp_old);
+      ::DeleteObject(bmp_mem);
+      ::DeleteDC(dc_mem);
+    } else {
+      // We're still waiting for the video stream to be initialized.
+      HBRUSH brush = ::CreateSolidBrush(RGB(0, 0, 0));
+      ::FillRect(ps.hdc, &rc, brush);
+      ::DeleteObject(brush);
+
+      HGDIOBJ old_font = ::SelectObject(ps.hdc, GetDefaultFont());
+      ::SetTextColor(ps.hdc, RGB(0xff, 0xff, 0xff));
+      ::SetBkMode(ps.hdc, TRANSPARENT);
+
+      std::string text(kConnecting);
+      if (!local_renderer->image()) {
+        text += kNoVideoStreams;
+      } else {
+        text += kNoIncomingStream;
+      }
+      ::DrawTextA(ps.hdc, text.c_str(), -1, &rc,
+          DT_SINGLELINE | DT_CENTER | DT_VCENTER);
+      ::SelectObject(ps.hdc, old_font);
+    }
+  } else {
+    HBRUSH brush = ::CreateSolidBrush(::GetSysColor(COLOR_WINDOW));
+    ::FillRect(ps.hdc, &rc, brush);
+    ::DeleteObject(brush);
+  }
+
+  ::EndPaint(handle(), &ps);
+}
+
+void MainWnd::OnDestroyed() {
+  PostQuitMessage(0);
+}
+
+void MainWnd::OnDefaultAction() {
+  if (!callback_)
+    return;
+  if (ui_ == CONNECT_TO_SERVER) {
+    std::string server(GetWindowText(edit1_));
+    std::string port_str(GetWindowText(edit2_));
+    int port = port_str.length() ? atoi(port_str.c_str()) : 0;
+    callback_->StartLogin(server, port);
+  } else if (ui_ == LIST_PEERS) {
+    LRESULT sel = ::SendMessage(listbox_, LB_GETCURSEL, 0, 0);
+    if (sel != LB_ERR) {
+      LRESULT peer_id = ::SendMessage(listbox_, LB_GETITEMDATA, sel, 0);
+      if (peer_id != -1 && callback_) {
+        callback_->ConnectToPeer(peer_id);
+      }
+    }
+  } else {
+    MessageBoxA(wnd_, "OK!", "Yeah", MB_OK);
+  }
+}
+
+bool MainWnd::OnMessage(UINT msg, WPARAM wp, LPARAM lp, LRESULT* result) {
+  switch (msg) {
+    case WM_ERASEBKGND:
+      *result = TRUE;
+      return true;
+
+    case WM_PAINT:
+      OnPaint();
+      return true;
+
+    case WM_SETFOCUS:
+      if (ui_ == CONNECT_TO_SERVER) {
+        SetFocus(edit1_);
+      } else if (ui_ == LIST_PEERS) {
+        SetFocus(listbox_);
+      }
+      return true;
+
+    case WM_SIZE:
+      if (ui_ == CONNECT_TO_SERVER) {
+        LayoutConnectUI(true);
+      } else if (ui_ == LIST_PEERS) {
+        LayoutPeerListUI(true);
+      }
+      break;
+
+    case WM_CTLCOLORSTATIC:
+      *result = reinterpret_cast<LRESULT>(GetSysColorBrush(COLOR_WINDOW));
+      return true;
+
+    case WM_COMMAND:
+      if (button_ == reinterpret_cast<HWND>(lp)) {
+        if (BN_CLICKED == HIWORD(wp))
+          OnDefaultAction();
+      } else if (listbox_ == reinterpret_cast<HWND>(lp)) {
+        if (LBN_DBLCLK == HIWORD(wp)) {
+          OnDefaultAction();
+        }
+      }
+      return true;
+
+    case WM_CLOSE:
+      if (callback_)
+        callback_->Close();
+      break;
+  }
+  return false;
+}
+
+// static
+LRESULT CALLBACK MainWnd::WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
+  MainWnd* me = reinterpret_cast<MainWnd*>(
+      ::GetWindowLongPtr(hwnd, GWLP_USERDATA));
+  if (!me && WM_CREATE == msg) {
+    CREATESTRUCT* cs = reinterpret_cast<CREATESTRUCT*>(lp);
+    me = reinterpret_cast<MainWnd*>(cs->lpCreateParams);
+    me->wnd_ = hwnd;
+    ::SetWindowLongPtr(hwnd, GWLP_USERDATA, reinterpret_cast<LONG_PTR>(me));
+  }
+
+  LRESULT result = 0;
+  if (me) {
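+    // Track message nesting so that OnDestroyed() runs only after the
+    // outermost message has been handled once WM_NCDESTROY is seen.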
+    void* prev_nested_msg = me->nested_msg_;
+    me->nested_msg_ = &msg;
+
+    bool handled = me->OnMessage(msg, wp, lp, &result);
+    if (WM_NCDESTROY == msg) {
+      me->destroyed_ = true;
+    } else if (!handled) {
+      result = ::DefWindowProc(hwnd, msg, wp, lp);
+    }
+
+    if (me->destroyed_ && prev_nested_msg == NULL) {
+      me->OnDestroyed();
+      me->wnd_ = NULL;
+      me->destroyed_ = false;
+    }
+
+    me->nested_msg_ = prev_nested_msg;
+  } else {
+    result = ::DefWindowProc(hwnd, msg, wp, lp);
+  }
+
+  return result;
+}
+
+// static
+bool MainWnd::RegisterWindowClass() {
+  if (wnd_class_)
+    return true;
+
+  WNDCLASSEX wcex = { sizeof(WNDCLASSEX) };
+  wcex.style = CS_DBLCLKS;
+  wcex.hInstance = GetModuleHandle(NULL);
+  wcex.hbrBackground = reinterpret_cast<HBRUSH>(COLOR_WINDOW + 1);
+  wcex.hCursor = ::LoadCursor(NULL, IDC_ARROW);
+  wcex.lpfnWndProc = &WndProc;
+  wcex.lpszClassName = kClassName;
+  wnd_class_ = ::RegisterClassEx(&wcex);
+  ASSERT(wnd_class_ != 0);
+  return wnd_class_ != 0;
+}
+
+void MainWnd::CreateChildWindow(HWND* wnd, MainWnd::ChildWindowID id,
+                                const wchar_t* class_name, DWORD control_style,
+                                DWORD ex_style) {
+  if (::IsWindow(*wnd))
+    return;
+
+  // Child windows are invisible at first, and shown after being resized.
+  DWORD style = WS_CHILD | control_style;
+  *wnd = ::CreateWindowEx(ex_style, class_name, L"", style,
+                          100, 100, 100, 100, wnd_,
+                          reinterpret_cast<HMENU>(id),
+                          GetModuleHandle(NULL), NULL);
+  ASSERT(::IsWindow(*wnd) != FALSE);
+  ::SendMessage(*wnd, WM_SETFONT, reinterpret_cast<WPARAM>(GetDefaultFont()),
+                TRUE);
+}
+
+void MainWnd::CreateChildWindows() {
+  // Create the child windows in tab order.
+  CreateChildWindow(&label1_, LABEL1_ID, L"Static", ES_CENTER | ES_READONLY, 0);
+  CreateChildWindow(&edit1_, EDIT_ID, L"Edit",
+                    ES_LEFT | ES_NOHIDESEL | WS_TABSTOP, WS_EX_CLIENTEDGE);
+  CreateChildWindow(&label2_, LABEL2_ID, L"Static", ES_CENTER | ES_READONLY, 0);
+  CreateChildWindow(&edit2_, EDIT_ID, L"Edit",
+                    ES_LEFT | ES_NOHIDESEL | WS_TABSTOP, WS_EX_CLIENTEDGE);
+  CreateChildWindow(&button_, BUTTON_ID, L"Button", BS_CENTER | WS_TABSTOP, 0);
+
+  CreateChildWindow(&listbox_, LISTBOX_ID, L"ListBox",
+                    LBS_HASSTRINGS | LBS_NOTIFY, WS_EX_CLIENTEDGE);
+
+  ::SetWindowTextA(edit1_, server_.c_str());
+  ::SetWindowTextA(edit2_, port_.c_str());
+}
+
+void MainWnd::LayoutConnectUI(bool show) {
+  struct Windows {
+    HWND wnd;
+    const wchar_t* text;
+    size_t width;
+    size_t height;
+  } windows[] = {
+    { label1_, L"Server" },
+    { edit1_, L"XXXyyyYYYgggXXXyyyYYYggg" },
+    { label2_, L":" },
+    { edit2_, L"XyXyX" },
+    { button_, L"Connect" },
+  };
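+  // Strings beginning with 'X' are placeholders used only to size the edit
+  // controls; they are never written into the windows (see the check below).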
+
+  if (show) {
+    const size_t kSeparator = 5;
+    size_t total_width = (ARRAYSIZE(windows) - 1) * kSeparator;
+
+    for (size_t i = 0; i < ARRAYSIZE(windows); ++i) {
+      CalculateWindowSizeForText(windows[i].wnd, windows[i].text,
+                                 &windows[i].width, &windows[i].height);
+      total_width += windows[i].width;
+    }
+
+    RECT rc;
+    ::GetClientRect(wnd_, &rc);
+    size_t x = (rc.right / 2) - (total_width / 2);
+    size_t y = rc.bottom / 2;
+    for (size_t i = 0; i < ARRAYSIZE(windows); ++i) {
+      size_t top = y - (windows[i].height / 2);
+      ::MoveWindow(windows[i].wnd, static_cast<int>(x), static_cast<int>(top),
+                   static_cast<int>(windows[i].width),
+                   static_cast<int>(windows[i].height),
+                   TRUE);
+      x += kSeparator + windows[i].width;
+      if (windows[i].text[0] != 'X')
+        ::SetWindowText(windows[i].wnd, windows[i].text);
+      ::ShowWindow(windows[i].wnd, SW_SHOWNA);
+    }
+  } else {
+    for (size_t i = 0; i < ARRAYSIZE(windows); ++i) {
+      ::ShowWindow(windows[i].wnd, SW_HIDE);
+    }
+  }
+}
+
+void MainWnd::LayoutPeerListUI(bool show) {
+  if (show) {
+    RECT rc;
+    ::GetClientRect(wnd_, &rc);
+    ::MoveWindow(listbox_, 0, 0, rc.right, rc.bottom, TRUE);
+    ::ShowWindow(listbox_, SW_SHOWNA);
+  } else {
+    ::ShowWindow(listbox_, SW_HIDE);
+    InvalidateRect(wnd_, NULL, TRUE);
+  }
+}
+
+void MainWnd::HandleTabbing() {
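+  // Walk the sibling window chain to move focus to the next (or previous,
+  // when Shift is held) visible child with WS_TABSTOP, wrapping around at
+  // either end.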
+  bool shift = ((::GetAsyncKeyState(VK_SHIFT) & 0x8000) != 0);
+  UINT next_cmd = shift ? GW_HWNDPREV : GW_HWNDNEXT;
+  UINT loop_around_cmd = shift ? GW_HWNDLAST : GW_HWNDFIRST;
+  HWND focus = GetFocus(), next;
+  do {
+    next = ::GetWindow(focus, next_cmd);
+    if (IsWindowVisible(next) &&
+        (GetWindowLong(next, GWL_STYLE) & WS_TABSTOP)) {
+      break;
+    }
+
+    if (!next) {
+      next = ::GetWindow(focus, loop_around_cmd);
+      if (IsWindowVisible(next) &&
+          (GetWindowLong(next, GWL_STYLE) & WS_TABSTOP)) {
+        break;
+      }
+    }
+    focus = next;
+  } while (true);
+  ::SetFocus(next);
+}
+
+//
+// MainWnd::VideoRenderer
+//
+
+MainWnd::VideoRenderer::VideoRenderer(
+    HWND wnd, int width, int height,
+    webrtc::VideoTrackInterface* track_to_render)
+    : wnd_(wnd), rendered_track_(track_to_render) {
+  ::InitializeCriticalSection(&buffer_lock_);
+  ZeroMemory(&bmi_, sizeof(bmi_));
+  bmi_.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
+  bmi_.bmiHeader.biPlanes = 1;
+  bmi_.bmiHeader.biBitCount = 32;
+  bmi_.bmiHeader.biCompression = BI_RGB;
+  bmi_.bmiHeader.biWidth = width;
+  bmi_.bmiHeader.biHeight = -height;
+  bmi_.bmiHeader.biSizeImage = width * height *
+                              (bmi_.bmiHeader.biBitCount >> 3);
+  rendered_track_->AddRenderer(this);
+}
+
+MainWnd::VideoRenderer::~VideoRenderer() {
+  rendered_track_->RemoveRenderer(this);
+  ::DeleteCriticalSection(&buffer_lock_);
+}
+
+void MainWnd::VideoRenderer::SetSize(int width, int height) {
+  AutoLock<VideoRenderer> lock(this);
+
+  // biHeight is stored negated (top-down DIB), so compare against its
+  // absolute value to avoid reallocating the buffer on every frame.
+  if (width == bmi_.bmiHeader.biWidth && height == -bmi_.bmiHeader.biHeight) {
+    return;
+  }
+
+  bmi_.bmiHeader.biWidth = width;
+  bmi_.bmiHeader.biHeight = -height;
+  bmi_.bmiHeader.biSizeImage = width * height *
+                               (bmi_.bmiHeader.biBitCount >> 3);
+  image_.reset(new uint8[bmi_.bmiHeader.biSizeImage]);
+}
+
+void MainWnd::VideoRenderer::RenderFrame(
+    const cricket::VideoFrame* video_frame) {
+  if (!video_frame)
+    return;
+
+  {
+    AutoLock<VideoRenderer> lock(this);
+
+    const cricket::VideoFrame* frame =
+        video_frame->GetCopyWithRotationApplied();
+
+    SetSize(static_cast<int>(frame->GetWidth()),
+            static_cast<int>(frame->GetHeight()));
+
+    ASSERT(image_.get() != NULL);
+    frame->ConvertToRgbBuffer(cricket::FOURCC_ARGB,
+                              image_.get(),
+                              bmi_.bmiHeader.biSizeImage,
+                              bmi_.bmiHeader.biWidth *
+                              bmi_.bmiHeader.biBitCount / 8);
+  }
+  InvalidateRect(wnd_, NULL, TRUE);
+}
diff --git a/examples/peerconnection/client/main_wnd.h b/examples/peerconnection/client/main_wnd.h
new file mode 100644
index 0000000..c11e94d
--- /dev/null
+++ b/examples/peerconnection/client/main_wnd.h
@@ -0,0 +1,200 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef PEERCONNECTION_SAMPLES_CLIENT_MAIN_WND_H_
+#define PEERCONNECTION_SAMPLES_CLIENT_MAIN_WND_H_
+#pragma once
+
+#include <map>
+#include <string>
+
+#include "talk/app/webrtc/mediastreaminterface.h"
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+#include "talk/media/base/mediachannel.h"
+#include "talk/media/base/videocommon.h"
+#include "talk/media/base/videoframe.h"
+#include "talk/media/base/videorenderer.h"
+#include "webrtc/base/win32.h"
+
+class MainWndCallback {
+ public:
+  virtual void StartLogin(const std::string& server, int port) = 0;
+  virtual void DisconnectFromServer() = 0;
+  virtual void ConnectToPeer(int peer_id) = 0;
+  virtual void DisconnectFromCurrentPeer() = 0;
+  virtual void UIThreadCallback(int msg_id, void* data) = 0;
+  virtual void Close() = 0;
+ protected:
+  virtual ~MainWndCallback() {}
+};
+
+// Pure virtual interface for the main window.
+class MainWindow {
+ public:
+  virtual ~MainWindow() {}
+
+  enum UI {
+    CONNECT_TO_SERVER,
+    LIST_PEERS,
+    STREAMING,
+  };
+
+  virtual void RegisterObserver(MainWndCallback* callback) = 0;
+
+  virtual bool IsWindow() = 0;
+  virtual void MessageBox(const char* caption, const char* text,
+                          bool is_error) = 0;
+
+  virtual UI current_ui() = 0;
+
+  virtual void SwitchToConnectUI() = 0;
+  virtual void SwitchToPeerList(const Peers& peers) = 0;
+  virtual void SwitchToStreamingUI() = 0;
+
+  virtual void StartLocalRenderer(webrtc::VideoTrackInterface* local_video) = 0;
+  virtual void StopLocalRenderer() = 0;
+  virtual void StartRemoteRenderer(webrtc::VideoTrackInterface* remote_video) = 0;
+  virtual void StopRemoteRenderer() = 0;
+
+  virtual void QueueUIThreadCallback(int msg_id, void* data) = 0;
+};
+
+#ifdef WIN32
+
+class MainWnd : public MainWindow {
+ public:
+  static const wchar_t kClassName[];
+
+  enum WindowMessages {
+    UI_THREAD_CALLBACK = WM_APP + 1,
+  };
+
+  MainWnd(const char* server, int port, bool auto_connect, bool auto_call);
+  ~MainWnd();
+
+  bool Create();
+  bool Destroy();
+  bool PreTranslateMessage(MSG* msg);
+
+  virtual void RegisterObserver(MainWndCallback* callback);
+  virtual bool IsWindow();
+  virtual void SwitchToConnectUI();
+  virtual void SwitchToPeerList(const Peers& peers);
+  virtual void SwitchToStreamingUI();
+  virtual void MessageBox(const char* caption, const char* text,
+                          bool is_error);
+  virtual UI current_ui() { return ui_; }
+
+  virtual void StartLocalRenderer(webrtc::VideoTrackInterface* local_video);
+  virtual void StopLocalRenderer();
+  virtual void StartRemoteRenderer(webrtc::VideoTrackInterface* remote_video);
+  virtual void StopRemoteRenderer();
+
+  virtual void QueueUIThreadCallback(int msg_id, void* data);
+
+  HWND handle() const { return wnd_; }
+
+  class VideoRenderer : public webrtc::VideoRendererInterface {
+   public:
+    VideoRenderer(HWND wnd, int width, int height,
+                  webrtc::VideoTrackInterface* track_to_render);
+    virtual ~VideoRenderer();
+
+    void Lock() {
+      ::EnterCriticalSection(&buffer_lock_);
+    }
+
+    void Unlock() {
+      ::LeaveCriticalSection(&buffer_lock_);
+    }
+
+    // VideoRendererInterface implementation
+    virtual void SetSize(int width, int height);
+    virtual void RenderFrame(const cricket::VideoFrame* frame);
+
+    const BITMAPINFO& bmi() const { return bmi_; }
+    const uint8* image() const { return image_.get(); }
+
+   protected:
+    enum {
+      SET_SIZE,
+      RENDER_FRAME,
+    };
+
+    HWND wnd_;
+    BITMAPINFO bmi_;
+    rtc::scoped_ptr<uint8[]> image_;
+    CRITICAL_SECTION buffer_lock_;
+    rtc::scoped_refptr<webrtc::VideoTrackInterface> rendered_track_;
+  };
+
+  // A little helper class to make sure we always do proper locking and
+  // unlocking when working with VideoRenderer buffers.
+  template <typename T>
+  class AutoLock {
+   public:
+    explicit AutoLock(T* obj) : obj_(obj) { obj_->Lock(); }
+    ~AutoLock() { obj_->Unlock(); }
+   protected:
+    T* obj_;
+  };
+
+ protected:
+  enum ChildWindowID {
+    EDIT_ID = 1,
+    BUTTON_ID,
+    LABEL1_ID,
+    LABEL2_ID,
+    LISTBOX_ID,
+  };
+
+  void OnPaint();
+  void OnDestroyed();
+
+  void OnDefaultAction();
+
+  bool OnMessage(UINT msg, WPARAM wp, LPARAM lp, LRESULT* result);
+
+  static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp);
+  static bool RegisterWindowClass();
+
+  void CreateChildWindow(HWND* wnd, ChildWindowID id, const wchar_t* class_name,
+                         DWORD control_style, DWORD ex_style);
+  void CreateChildWindows();
+
+  void LayoutConnectUI(bool show);
+  void LayoutPeerListUI(bool show);
+
+  void HandleTabbing();
+
+ private:
+  rtc::scoped_ptr<VideoRenderer> local_renderer_;
+  rtc::scoped_ptr<VideoRenderer> remote_renderer_;
+  UI ui_;
+  HWND wnd_;
+  DWORD ui_thread_id_;
+  HWND edit1_;
+  HWND edit2_;
+  HWND label1_;
+  HWND label2_;
+  HWND button_;
+  HWND listbox_;
+  bool destroyed_;
+  void* nested_msg_;
+  MainWndCallback* callback_;
+  static ATOM wnd_class_;
+  std::string server_;
+  std::string port_;
+  bool auto_connect_;
+  bool auto_call_;
+};
+#endif  // WIN32
+
+#endif  // PEERCONNECTION_SAMPLES_CLIENT_MAIN_WND_H_
diff --git a/examples/peerconnection/client/peer_connection_client.cc b/examples/peerconnection/client/peer_connection_client.cc
new file mode 100644
index 0000000..d49ce35
--- /dev/null
+++ b/examples/peerconnection/client/peer_connection_client.cc
@@ -0,0 +1,514 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/client/peer_connection_client.h"
+
+#include "webrtc/examples/peerconnection/client/defaults.h"
+#include "webrtc/base/common.h"
+#include "webrtc/base/logging.h"
+#include "webrtc/base/nethelpers.h"
+#include "webrtc/base/stringutils.h"
+
+#ifdef WIN32
+#include "webrtc/base/win32socketserver.h"
+#endif
+
+using rtc::sprintfn;
+
+namespace {
+
+// This is our magical hangup signal.
+const char kByeMessage[] = "BYE";
+// Delay between server connection retries, in milliseconds
+const int kReconnectDelay = 2000;
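+
+// The client talks to the peerconnection_server signaling server over plain
+// HTTP/1.0: GET /sign_in registers this client, GET /wait long-polls for
+// notifications, POST /message relays a payload to another peer, and
+// GET /sign_out removes the registration.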
+
+rtc::AsyncSocket* CreateClientSocket(int family) {
+#ifdef WIN32
+  rtc::Win32Socket* sock = new rtc::Win32Socket();
+  sock->CreateT(family, SOCK_STREAM);
+  return sock;
+#elif defined(WEBRTC_POSIX)
+  rtc::Thread* thread = rtc::Thread::Current();
+  ASSERT(thread != NULL);
+  return thread->socketserver()->CreateAsyncSocket(family, SOCK_STREAM);
+#else
+#error Platform not supported.
+#endif
+}
+
+}
+
+PeerConnectionClient::PeerConnectionClient()
+  : callback_(NULL),
+    resolver_(NULL),
+    state_(NOT_CONNECTED),
+    my_id_(-1) {
+}
+
+PeerConnectionClient::~PeerConnectionClient() {
+}
+
+void PeerConnectionClient::InitSocketSignals() {
+  ASSERT(control_socket_.get() != NULL);
+  ASSERT(hanging_get_.get() != NULL);
+  control_socket_->SignalCloseEvent.connect(this,
+      &PeerConnectionClient::OnClose);
+  hanging_get_->SignalCloseEvent.connect(this,
+      &PeerConnectionClient::OnClose);
+  control_socket_->SignalConnectEvent.connect(this,
+      &PeerConnectionClient::OnConnect);
+  hanging_get_->SignalConnectEvent.connect(this,
+      &PeerConnectionClient::OnHangingGetConnect);
+  control_socket_->SignalReadEvent.connect(this,
+      &PeerConnectionClient::OnRead);
+  hanging_get_->SignalReadEvent.connect(this,
+      &PeerConnectionClient::OnHangingGetRead);
+}
+
+int PeerConnectionClient::id() const {
+  return my_id_;
+}
+
+bool PeerConnectionClient::is_connected() const {
+  return my_id_ != -1;
+}
+
+const Peers& PeerConnectionClient::peers() const {
+  return peers_;
+}
+
+void PeerConnectionClient::RegisterObserver(
+    PeerConnectionClientObserver* callback) {
+  ASSERT(!callback_);
+  callback_ = callback;
+}
+
+void PeerConnectionClient::Connect(const std::string& server, int port,
+                                   const std::string& client_name) {
+  ASSERT(!server.empty());
+  ASSERT(!client_name.empty());
+
+  if (state_ != NOT_CONNECTED) {
+    LOG(WARNING)
+        << "The client must not already be connected when Connect() is called";
+    callback_->OnServerConnectionFailure();
+    return;
+  }
+
+  if (server.empty() || client_name.empty()) {
+    callback_->OnServerConnectionFailure();
+    return;
+  }
+
+  if (port <= 0)
+    port = kDefaultServerPort;
+
+  server_address_.SetIP(server);
+  server_address_.SetPort(port);
+  client_name_ = client_name;
+
+  if (server_address_.IsUnresolved()) {
+    state_ = RESOLVING;
+    resolver_ = new rtc::AsyncResolver();
+    resolver_->SignalDone.connect(this, &PeerConnectionClient::OnResolveResult);
+    resolver_->Start(server_address_);
+  } else {
+    DoConnect();
+  }
+}
+
+void PeerConnectionClient::OnResolveResult(
+    rtc::AsyncResolverInterface* resolver) {
+  if (resolver_->GetError() != 0) {
+    callback_->OnServerConnectionFailure();
+    resolver_->Destroy(false);
+    resolver_ = NULL;
+    state_ = NOT_CONNECTED;
+  } else {
+    server_address_ = resolver_->address();
+    DoConnect();
+  }
+}
+
+void PeerConnectionClient::DoConnect() {
+  control_socket_.reset(CreateClientSocket(server_address_.ipaddr().family()));
+  hanging_get_.reset(CreateClientSocket(server_address_.ipaddr().family()));
+  InitSocketSignals();
+  char buffer[1024];
+  sprintfn(buffer, sizeof(buffer),
+           "GET /sign_in?%s HTTP/1.0\r\n\r\n", client_name_.c_str());
+  onconnect_data_ = buffer;
+
+  bool ret = ConnectControlSocket();
+  if (ret) {
+    state_ = SIGNING_IN;
+  } else {
+    callback_->OnServerConnectionFailure();
+  }
+}
+
+bool PeerConnectionClient::SendToPeer(int peer_id, const std::string& message) {
+  if (state_ != CONNECTED)
+    return false;
+
+  ASSERT(is_connected());
+  ASSERT(control_socket_->GetState() == rtc::Socket::CS_CLOSED);
+  if (!is_connected() || peer_id == -1)
+    return false;
+
+  char headers[1024];
+  sprintfn(headers, sizeof(headers),
+      "POST /message?peer_id=%i&to=%i HTTP/1.0\r\n"
+      "Content-Length: %i\r\n"
+      "Content-Type: text/plain\r\n"
+      "\r\n",
+      my_id_, peer_id, static_cast<int>(message.length()));
+  onconnect_data_ = headers;
+  onconnect_data_ += message;
+  return ConnectControlSocket();
+}
+
+bool PeerConnectionClient::SendHangUp(int peer_id) {
+  return SendToPeer(peer_id, kByeMessage);
+}
+
+bool PeerConnectionClient::IsSendingMessage() {
+  return state_ == CONNECTED &&
+         control_socket_->GetState() != rtc::Socket::CS_CLOSED;
+}
+
+bool PeerConnectionClient::SignOut() {
+  if (state_ == NOT_CONNECTED || state_ == SIGNING_OUT)
+    return true;
+
+  if (hanging_get_->GetState() != rtc::Socket::CS_CLOSED)
+    hanging_get_->Close();
+
+  if (control_socket_->GetState() == rtc::Socket::CS_CLOSED) {
+    state_ = SIGNING_OUT;
+
+    if (my_id_ != -1) {
+      char buffer[1024];
+      sprintfn(buffer, sizeof(buffer),
+          "GET /sign_out?peer_id=%i HTTP/1.0\r\n\r\n", my_id_);
+      onconnect_data_ = buffer;
+      return ConnectControlSocket();
+    } else {
+      // Can occur if the app is closed before we finish connecting.
+      return true;
+    }
+  } else {
+    state_ = SIGNING_OUT_WAITING;
+  }
+
+  return true;
+}
+
+void PeerConnectionClient::Close() {
+  control_socket_->Close();
+  hanging_get_->Close();
+  onconnect_data_.clear();
+  peers_.clear();
+  if (resolver_ != NULL) {
+    resolver_->Destroy(false);
+    resolver_ = NULL;
+  }
+  my_id_ = -1;
+  state_ = NOT_CONNECTED;
+}
+
+bool PeerConnectionClient::ConnectControlSocket() {
+  ASSERT(control_socket_->GetState() == rtc::Socket::CS_CLOSED);
+  int err = control_socket_->Connect(server_address_);
+  if (err == SOCKET_ERROR) {
+    Close();
+    return false;
+  }
+  return true;
+}
+
+void PeerConnectionClient::OnConnect(rtc::AsyncSocket* socket) {
+  ASSERT(!onconnect_data_.empty());
+  size_t sent = socket->Send(onconnect_data_.c_str(), onconnect_data_.length());
+  ASSERT(sent == onconnect_data_.length());
+  RTC_UNUSED(sent);
+  onconnect_data_.clear();
+}
+
+void PeerConnectionClient::OnHangingGetConnect(rtc::AsyncSocket* socket) {
+  char buffer[1024];
+  sprintfn(buffer, sizeof(buffer),
+           "GET /wait?peer_id=%i HTTP/1.0\r\n\r\n", my_id_);
+  int len = static_cast<int>(strlen(buffer));
+  int sent = socket->Send(buffer, len);
+  ASSERT(sent == len);
+  RTC_UNUSED2(sent, len);
+}
+
+void PeerConnectionClient::OnMessageFromPeer(int peer_id,
+                                             const std::string& message) {
+  if (message.length() == (sizeof(kByeMessage) - 1) &&
+      message.compare(kByeMessage) == 0) {
+    callback_->OnPeerDisconnected(peer_id);
+  } else {
+    callback_->OnMessageFromPeer(peer_id, message);
+  }
+}
+
+bool PeerConnectionClient::GetHeaderValue(const std::string& data,
+                                          size_t eoh,
+                                          const char* header_pattern,
+                                          size_t* value) {
+  ASSERT(value != NULL);
+  size_t found = data.find(header_pattern);
+  if (found != std::string::npos && found < eoh) {
+    *value = atoi(&data[found + strlen(header_pattern)]);
+    return true;
+  }
+  return false;
+}
+
+bool PeerConnectionClient::GetHeaderValue(const std::string& data, size_t eoh,
+                                          const char* header_pattern,
+                                          std::string* value) {
+  ASSERT(value != NULL);
+  size_t found = data.find(header_pattern);
+  if (found != std::string::npos && found < eoh) {
+    size_t begin = found + strlen(header_pattern);
+    size_t end = data.find("\r\n", begin);
+    if (end == std::string::npos)
+      end = eoh;
+    value->assign(data.substr(begin, end - begin));
+    return true;
+  }
+  return false;
+}
+
+bool PeerConnectionClient::ReadIntoBuffer(rtc::AsyncSocket* socket,
+                                          std::string* data,
+                                          size_t* content_length) {
+  char buffer[0xffff];
+  do {
+    int bytes = socket->Recv(buffer, sizeof(buffer));
+    if (bytes <= 0)
+      break;
+    data->append(buffer, bytes);
+  } while (true);
+
+  bool ret = false;
+  size_t i = data->find("\r\n\r\n");
+  if (i != std::string::npos) {
+    LOG(INFO) << "Headers received";
+    if (GetHeaderValue(*data, i, "\r\nContent-Length: ", content_length)) {
+      size_t total_response_size = (i + 4) + *content_length;
+      if (data->length() >= total_response_size) {
+        ret = true;
+        std::string should_close;
+        const char kConnection[] = "\r\nConnection: ";
+        if (GetHeaderValue(*data, i, kConnection, &should_close) &&
+            should_close.compare("close") == 0) {
+          socket->Close();
+          // Since we closed the socket, there was no notification delivered
+          // to us.  Compensate by letting ourselves know.
+          OnClose(socket, 0);
+        }
+      } else {
+        // We haven't received everything.  Just continue to accept data.
+      }
+    } else {
+      LOG(LS_ERROR) << "No content length field specified by the server.";
+    }
+  }
+  return ret;
+}
+
+void PeerConnectionClient::OnRead(rtc::AsyncSocket* socket) {
+  size_t content_length = 0;
+  if (ReadIntoBuffer(socket, &control_data_, &content_length)) {
+    size_t peer_id = 0, eoh = 0;
+    bool ok = ParseServerResponse(control_data_, content_length, &peer_id,
+                                  &eoh);
+    if (ok) {
+      if (my_id_ == -1) {
+        // First response.  Let's store our server assigned ID.
+        ASSERT(state_ == SIGNING_IN);
+        my_id_ = static_cast<int>(peer_id);
+        ASSERT(my_id_ != -1);
+
+        // The body of the response will be a list of already connected peers.
+        if (content_length) {
+          size_t pos = eoh + 4;
+          while (pos < control_data_.size()) {
+            size_t eol = control_data_.find('\n', pos);
+            if (eol == std::string::npos)
+              break;
+            int id = 0;
+            std::string name;
+            bool connected;
+            if (ParseEntry(control_data_.substr(pos, eol - pos), &name, &id,
+                           &connected) && id != my_id_) {
+              peers_[id] = name;
+              callback_->OnPeerConnected(id, name);
+            }
+            pos = eol + 1;
+          }
+        }
+        ASSERT(is_connected());
+        callback_->OnSignedIn();
+      } else if (state_ == SIGNING_OUT) {
+        Close();
+        callback_->OnDisconnected();
+      } else if (state_ == SIGNING_OUT_WAITING) {
+        SignOut();
+      }
+    }
+
+    control_data_.clear();
+
+    if (state_ == SIGNING_IN) {
+      ASSERT(hanging_get_->GetState() == rtc::Socket::CS_CLOSED);
+      state_ = CONNECTED;
+      hanging_get_->Connect(server_address_);
+    }
+  }
+}
+
+void PeerConnectionClient::OnHangingGetRead(rtc::AsyncSocket* socket) {
+  LOG(INFO) << __FUNCTION__;
+  size_t content_length = 0;
+  if (ReadIntoBuffer(socket, &notification_data_, &content_length)) {
+    size_t peer_id = 0, eoh = 0;
+    bool ok = ParseServerResponse(notification_data_, content_length,
+                                  &peer_id, &eoh);
+
+    if (ok) {
+      // Store the position where the body begins.
+      size_t pos = eoh + 4;
+
+      if (my_id_ == static_cast<int>(peer_id)) {
+        // A notification about a new member or a member that just
+        // disconnected.
+        int id = 0;
+        std::string name;
+        bool connected = false;
+        if (ParseEntry(notification_data_.substr(pos), &name, &id,
+                       &connected)) {
+          if (connected) {
+            peers_[id] = name;
+            callback_->OnPeerConnected(id, name);
+          } else {
+            peers_.erase(id);
+            callback_->OnPeerDisconnected(id);
+          }
+        }
+      } else {
+        OnMessageFromPeer(static_cast<int>(peer_id),
+                          notification_data_.substr(pos));
+      }
+    }
+
+    notification_data_.clear();
+  }
+
+  if (hanging_get_->GetState() == rtc::Socket::CS_CLOSED &&
+      state_ == CONNECTED) {
+    hanging_get_->Connect(server_address_);
+  }
+}
+
+bool PeerConnectionClient::ParseEntry(const std::string& entry,
+                                      std::string* name,
+                                      int* id,
+                                      bool* connected) {
+  ASSERT(name != NULL);
+  ASSERT(id != NULL);
+  ASSERT(connected != NULL);
+  ASSERT(!entry.empty());
+
+  *connected = false;
+  size_t separator = entry.find(',');
+  if (separator != std::string::npos) {
+    *id = atoi(&entry[separator + 1]);
+    name->assign(entry.substr(0, separator));
+    separator = entry.find(',', separator + 1);
+    if (separator != std::string::npos) {
+      *connected = atoi(&entry[separator + 1]) ? true : false;
+    }
+  }
+  return !name->empty();
+}
+
+int PeerConnectionClient::GetResponseStatus(const std::string& response) {
+  int status = -1;
+  size_t pos = response.find(' ');
+  if (pos != std::string::npos)
+    status = atoi(&response[pos + 1]);
+  return status;
+}
+
+bool PeerConnectionClient::ParseServerResponse(const std::string& response,
+                                               size_t content_length,
+                                               size_t* peer_id,
+                                               size_t* eoh) {
+  int status = GetResponseStatus(response);
+  if (status != 200) {
+    LOG(LS_ERROR) << "Received error from server";
+    Close();
+    callback_->OnDisconnected();
+    return false;
+  }
+
+  *eoh = response.find("\r\n\r\n");
+  ASSERT(*eoh != std::string::npos);
+  if (*eoh == std::string::npos)
+    return false;
+
+  *peer_id = -1;
+
+  // See comment in peer_channel.cc for why we use the Pragma header and
+  // not e.g. "X-Peer-Id".
+  GetHeaderValue(response, *eoh, "\r\nPragma: ", peer_id);
+
+  return true;
+}
+
+void PeerConnectionClient::OnClose(rtc::AsyncSocket* socket, int err) {
+  LOG(INFO) << __FUNCTION__;
+
+  socket->Close();
+
+#ifdef WIN32
+  if (err != WSAECONNREFUSED) {
+#else
+  if (err != ECONNREFUSED) {
+#endif
+    if (socket == hanging_get_.get()) {
+      if (state_ == CONNECTED) {
+        hanging_get_->Close();
+        hanging_get_->Connect(server_address_);
+      }
+    } else {
+      callback_->OnMessageSent(err);
+    }
+  } else {
+    if (socket == control_socket_.get()) {
+      LOG(WARNING) << "Connection refused; retrying in 2 seconds";
+      rtc::Thread::Current()->PostDelayed(kReconnectDelay, this, 0);
+    } else {
+      Close();
+      callback_->OnDisconnected();
+    }
+  }
+}
+
+void PeerConnectionClient::OnMessage(rtc::Message* msg) {
+  // ignore msg; there is currently only one supported message ("retry")
+  DoConnect();
+}
diff --git a/examples/peerconnection/client/peer_connection_client.h b/examples/peerconnection/client/peer_connection_client.h
new file mode 100644
index 0000000..5b5787b
--- /dev/null
+++ b/examples/peerconnection/client/peer_connection_client.h
@@ -0,0 +1,123 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef PEERCONNECTION_SAMPLES_CLIENT_PEER_CONNECTION_CLIENT_H_
+#define PEERCONNECTION_SAMPLES_CLIENT_PEER_CONNECTION_CLIENT_H_
+#pragma once
+
+#include <map>
+#include <string>
+
+#include "webrtc/base/nethelpers.h"
+#include "webrtc/base/physicalsocketserver.h"
+#include "webrtc/base/scoped_ptr.h"
+#include "webrtc/base/signalthread.h"
+#include "webrtc/base/sigslot.h"
+
+typedef std::map<int, std::string> Peers;
+
+struct PeerConnectionClientObserver {
+  virtual void OnSignedIn() = 0;  // Called when we're logged on.
+  virtual void OnDisconnected() = 0;
+  virtual void OnPeerConnected(int id, const std::string& name) = 0;
+  virtual void OnPeerDisconnected(int peer_id) = 0;
+  virtual void OnMessageFromPeer(int peer_id, const std::string& message) = 0;
+  virtual void OnMessageSent(int err) = 0;
+  virtual void OnServerConnectionFailure() = 0;
+
+ protected:
+  virtual ~PeerConnectionClientObserver() {}
+};
+
+class PeerConnectionClient : public sigslot::has_slots<>,
+                             public rtc::MessageHandler {
+ public:
+  enum State {
+    NOT_CONNECTED,
+    RESOLVING,
+    SIGNING_IN,
+    CONNECTED,
+    SIGNING_OUT_WAITING,
+    SIGNING_OUT,
+  };
+
+  PeerConnectionClient();
+  ~PeerConnectionClient();
+
+  int id() const;
+  bool is_connected() const;
+  const Peers& peers() const;
+
+  void RegisterObserver(PeerConnectionClientObserver* callback);
+
+  void Connect(const std::string& server, int port,
+               const std::string& client_name);
+
+  bool SendToPeer(int peer_id, const std::string& message);
+  bool SendHangUp(int peer_id);
+  bool IsSendingMessage();
+
+  bool SignOut();
+
+  // implements the MessageHandler interface
+  void OnMessage(rtc::Message* msg);
+
+ protected:
+  void DoConnect();
+  void Close();
+  void InitSocketSignals();
+  bool ConnectControlSocket();
+  void OnConnect(rtc::AsyncSocket* socket);
+  void OnHangingGetConnect(rtc::AsyncSocket* socket);
+  void OnMessageFromPeer(int peer_id, const std::string& message);
+
+  // Quick and dirty support for parsing HTTP header values.
+  bool GetHeaderValue(const std::string& data, size_t eoh,
+                      const char* header_pattern, size_t* value);
+
+  bool GetHeaderValue(const std::string& data, size_t eoh,
+                      const char* header_pattern, std::string* value);
+
+  // Returns true if the whole response has been read.
+  bool ReadIntoBuffer(rtc::AsyncSocket* socket, std::string* data,
+                      size_t* content_length);
+
+  void OnRead(rtc::AsyncSocket* socket);
+
+  void OnHangingGetRead(rtc::AsyncSocket* socket);
+
+  // Parses a single line entry in the form "<name>,<id>,<connected>"
+  bool ParseEntry(const std::string& entry, std::string* name, int* id,
+                  bool* connected);
+
+  int GetResponseStatus(const std::string& response);
+
+  bool ParseServerResponse(const std::string& response, size_t content_length,
+                           size_t* peer_id, size_t* eoh);
+
+  void OnClose(rtc::AsyncSocket* socket, int err);
+
+  void OnResolveResult(rtc::AsyncResolverInterface* resolver);
+
+  PeerConnectionClientObserver* callback_;
+  rtc::SocketAddress server_address_;
+  rtc::AsyncResolver* resolver_;
+  rtc::scoped_ptr<rtc::AsyncSocket> control_socket_;
+  rtc::scoped_ptr<rtc::AsyncSocket> hanging_get_;
+  std::string onconnect_data_;
+  std::string control_data_;
+  std::string notification_data_;
+  std::string client_name_;
+  Peers peers_;
+  State state_;
+  int my_id_;
+};
+
+#endif  // PEERCONNECTION_SAMPLES_CLIENT_PEER_CONNECTION_CLIENT_H_
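
For reference, a minimal sketch of how this client API is intended to be driven: implement PeerConnectionClientObserver, register it, and call Connect(). ConsoleObserver and ConnectToLocalServer are illustrative names only, not part of this change; the server address and port match the defaults used elsewhere in the examples.

```cpp
// Minimal sketch only. Assumes it is built and run the same way as the
// peerconnection_client example, whose rtc::Thread message loop drives the
// asynchronous socket I/O.
#include <stdio.h>

#include <string>

#include "webrtc/examples/peerconnection/client/peer_connection_client.h"

class ConsoleObserver : public PeerConnectionClientObserver {
 public:
  virtual void OnSignedIn() { printf("signed in\n"); }
  virtual void OnDisconnected() { printf("disconnected\n"); }
  virtual void OnPeerConnected(int id, const std::string& name) {
    printf("peer %d (%s) connected\n", id, name.c_str());
  }
  virtual void OnPeerDisconnected(int peer_id) {
    printf("peer %d disconnected\n", peer_id);
  }
  virtual void OnMessageFromPeer(int peer_id, const std::string& message) {
    printf("message from %d: %s\n", peer_id, message.c_str());
  }
  virtual void OnMessageSent(int err) {}
  virtual void OnServerConnectionFailure() { printf("connect failed\n"); }
};

void ConnectToLocalServer(PeerConnectionClient* client,
                          ConsoleObserver* observer) {
  client->RegisterObserver(observer);
  // "localhost", 8888 and "my_name" mirror the defaults used by
  // peerconnection_server and server_test.html.
  client->Connect("localhost", 8888, "my_name");
}
```
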
diff --git a/examples/peerconnection/server/data_socket.cc b/examples/peerconnection/server/data_socket.cc
new file mode 100644
index 0000000..60e40a6
--- /dev/null
+++ b/examples/peerconnection/server/data_socket.cc
@@ -0,0 +1,293 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/server/data_socket.h"
+
+#include <ctype.h>
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+#if defined(WEBRTC_POSIX)
+#include <unistd.h>
+#endif
+
+#include "webrtc/examples/peerconnection/server/utils.h"
+
+static const char kHeaderTerminator[] = "\r\n\r\n";
+static const int kHeaderTerminatorLength = sizeof(kHeaderTerminator) - 1;
+
+// static
+const char DataSocket::kCrossOriginAllowHeaders[] =
+    "Access-Control-Allow-Origin: *\r\n"
+    "Access-Control-Allow-Credentials: true\r\n"
+    "Access-Control-Allow-Methods: POST, GET, OPTIONS\r\n"
+    "Access-Control-Allow-Headers: Content-Type, "
+        "Content-Length, Connection, Cache-Control\r\n"
+    "Access-Control-Expose-Headers: Content-Length, X-Peer-Id\r\n";
+
+#if defined(WIN32)
+class WinsockInitializer {
+  static WinsockInitializer singleton;
+
+  WinsockInitializer() {
+    WSADATA data;
+    WSAStartup(MAKEWORD(1, 0), &data);
+  }
+
+ public:
+  ~WinsockInitializer() { WSACleanup(); }
+};
+WinsockInitializer WinsockInitializer::singleton;
+#endif
+
+//
+// SocketBase
+//
+
+bool SocketBase::Create() {
+  assert(!valid());
+  socket_ = ::socket(AF_INET, SOCK_STREAM, 0);
+  return valid();
+}
+
+void SocketBase::Close() {
+  if (socket_ != INVALID_SOCKET) {
+    closesocket(socket_);
+    socket_ = INVALID_SOCKET;
+  }
+}
+
+//
+// DataSocket
+//
+
+std::string DataSocket::request_arguments() const {
+  size_t args = request_path_.find('?');
+  if (args != std::string::npos)
+    return request_path_.substr(args + 1);
+  return "";
+}
+
+bool DataSocket::PathEquals(const char* path) const {
+  assert(path);
+  size_t args = request_path_.find('?');
+  if (args != std::string::npos)
+    return request_path_.substr(0, args).compare(path) == 0;
+  return request_path_.compare(path) == 0;
+}
+
+bool DataSocket::OnDataAvailable(bool* close_socket) {
+  assert(valid());
+  char buffer[0xfff] = {0};
+  int bytes = recv(socket_, buffer, sizeof(buffer), 0);
+  if (bytes == SOCKET_ERROR || bytes == 0) {
+    *close_socket = true;
+    return false;
+  }
+
+  *close_socket = false;
+
+  bool ret = true;
+  if (headers_received()) {
+    if (method_ != POST) {
+      // unexpectedly received data.
+      ret = false;
+    } else {
+      data_.append(buffer, bytes);
+    }
+  } else {
+    request_headers_.append(buffer, bytes);
+    size_t found = request_headers_.find(kHeaderTerminator);
+    if (found != std::string::npos) {
+      data_ = request_headers_.substr(found + kHeaderTerminatorLength);
+      request_headers_.resize(found + kHeaderTerminatorLength);
+      ret = ParseHeaders();
+    }
+  }
+  return ret;
+}
+
+bool DataSocket::Send(const std::string& data) const {
+  return send(socket_, data.data(), static_cast<int>(data.length()), 0) !=
+      SOCKET_ERROR;
+}
+
+bool DataSocket::Send(const std::string& status, bool connection_close,
+                      const std::string& content_type,
+                      const std::string& extra_headers,
+                      const std::string& data) const {
+  assert(valid());
+  assert(!status.empty());
+  std::string buffer("HTTP/1.1 " + status + "\r\n");
+
+  buffer += "Server: PeerConnectionTestServer/0.1\r\n"
+            "Cache-Control: no-cache\r\n";
+
+  if (connection_close)
+    buffer += "Connection: close\r\n";
+
+  if (!content_type.empty())
+    buffer += "Content-Type: " + content_type + "\r\n";
+
+  buffer += "Content-Length: " + int2str(static_cast<int>(data.size())) +
+            "\r\n";
+
+  if (!extra_headers.empty()) {
+    buffer += extra_headers;
+    // Extra headers are assumed to have a separator per header.
+  }
+
+  buffer += kCrossOriginAllowHeaders;
+
+  buffer += "\r\n";
+  buffer += data;
+
+  return Send(buffer);
+}
+
+void DataSocket::Clear() {
+  method_ = INVALID;
+  content_length_ = 0;
+  content_type_.clear();
+  request_path_.clear();
+  request_headers_.clear();
+  data_.clear();
+}
+
+bool DataSocket::ParseHeaders() {
+  assert(!request_headers_.empty());
+  assert(method_ == INVALID);
+  size_t i = request_headers_.find("\r\n");
+  if (i == std::string::npos)
+    return false;
+
+  if (!ParseMethodAndPath(request_headers_.data(), i))
+    return false;
+
+  assert(method_ != INVALID);
+  assert(!request_path_.empty());
+
+  if (method_ == POST) {
+    const char* headers = request_headers_.data() + i + 2;
+    size_t len = request_headers_.length() - i - 2;
+    if (!ParseContentLengthAndType(headers, len))
+      return false;
+  }
+
+  return true;
+}
+
+bool DataSocket::ParseMethodAndPath(const char* begin, size_t len) {
+  struct {
+    const char* method_name;
+    size_t method_name_len;
+    RequestMethod id;
+  } supported_methods[] = {
+    { "GET", 3, GET },
+    { "POST", 4, POST },
+    { "OPTIONS", 7, OPTIONS },
+  };
+
+  const char* path = NULL;
+  for (size_t i = 0; i < ARRAYSIZE(supported_methods); ++i) {
+    if (len > supported_methods[i].method_name_len &&
+        isspace(begin[supported_methods[i].method_name_len]) &&
+        strncmp(begin, supported_methods[i].method_name,
+                supported_methods[i].method_name_len) == 0) {
+      method_ = supported_methods[i].id;
+      path = begin + supported_methods[i].method_name_len;
+      break;
+    }
+  }
+
+  const char* end = begin + len;
+  if (!path || path >= end)
+    return false;
+
+  ++path;
+  begin = path;
+  while (!isspace(*path) && path < end)
+    ++path;
+
+  request_path_.assign(begin, path - begin);
+
+  return true;
+}
+
+bool DataSocket::ParseContentLengthAndType(const char* headers, size_t length) {
+  assert(content_length_ == 0);
+  assert(content_type_.empty());
+
+  const char* end = headers + length;
+  while (headers && headers < end) {
+    if (!isspace(headers[0])) {
+      static const char kContentLength[] = "Content-Length:";
+      static const char kContentType[] = "Content-Type:";
+      if ((headers + ARRAYSIZE(kContentLength)) < end &&
+          strncmp(headers, kContentLength,
+                  ARRAYSIZE(kContentLength) - 1) == 0) {
+        headers += ARRAYSIZE(kContentLength) - 1;
+        while (headers[0] == ' ')
+          ++headers;
+        content_length_ = atoi(headers);
+      } else if ((headers + ARRAYSIZE(kContentType)) < end &&
+                 strncmp(headers, kContentType,
+                         ARRAYSIZE(kContentType) - 1) == 0) {
+        headers += ARRAYSIZE(kContentType) - 1;
+        while (headers[0] == ' ')
+          ++headers;
+        const char* type_end = strstr(headers, "\r\n");
+        if (type_end == NULL)
+          type_end = end;
+        content_type_.assign(headers, type_end);
+      }
+    } else {
+      ++headers;
+    }
+    headers = strstr(headers, "\r\n");
+    if (headers)
+      headers += 2;
+  }
+
+  return !content_type_.empty() && content_length_ != 0;
+}
+
+//
+// ListeningSocket
+//
+
+bool ListeningSocket::Listen(unsigned short port) {
+  assert(valid());
+  int enabled = 1;
+  setsockopt(socket_, SOL_SOCKET, SO_REUSEADDR,
+      reinterpret_cast<const char*>(&enabled), sizeof(enabled));
+  struct sockaddr_in addr = {0};
+  addr.sin_family = AF_INET;
+  addr.sin_addr.s_addr = htonl(INADDR_ANY);
+  addr.sin_port = htons(port);
+  if (bind(socket_, reinterpret_cast<const sockaddr*>(&addr),
+           sizeof(addr)) == SOCKET_ERROR) {
+    printf("bind failed\n");
+    return false;
+  }
+  return listen(socket_, 5) != SOCKET_ERROR;
+}
+
+DataSocket* ListeningSocket::Accept() const {
+  assert(valid());
+  struct sockaddr_in addr = {0};
+  socklen_t size = sizeof(addr);
+  NativeSocket client =
+      accept(socket_, reinterpret_cast<sockaddr*>(&addr), &size);
+  if (client == INVALID_SOCKET)
+    return NULL;
+
+  return new DataSocket(client);
+}
diff --git a/examples/peerconnection/server/data_socket.h b/examples/peerconnection/server/data_socket.h
new file mode 100644
index 0000000..454ad39
--- /dev/null
+++ b/examples/peerconnection/server/data_socket.h
@@ -0,0 +1,153 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef TALK_EXAMPLES_PEERCONNECTION_SERVER_DATA_SOCKET_H_
+#define TALK_EXAMPLES_PEERCONNECTION_SERVER_DATA_SOCKET_H_
+#pragma once
+
+#ifdef WIN32
+#include <winsock2.h>
+typedef int socklen_t;
+typedef SOCKET NativeSocket;
+#else
+#include <netinet/in.h>
+#include <sys/select.h>
+#include <sys/socket.h>
+#define closesocket close
+typedef int NativeSocket;
+
+#ifndef SOCKET_ERROR
+#define SOCKET_ERROR (-1)
+#endif
+
+#ifndef INVALID_SOCKET
+#define INVALID_SOCKET  static_cast<NativeSocket>(-1)
+#endif
+#endif
+
+#include <string>
+
+class SocketBase {
+ public:
+  SocketBase() : socket_(INVALID_SOCKET) { }
+  explicit SocketBase(NativeSocket socket) : socket_(socket) { }
+  ~SocketBase() { Close(); }
+
+  NativeSocket socket() const { return socket_; }
+  bool valid() const { return socket_ != INVALID_SOCKET; }
+
+  bool Create();
+  void Close();
+
+ protected:
+  NativeSocket socket_;
+};
+
+// Represents an HTTP server socket.
+class DataSocket : public SocketBase {
+ public:
+  enum RequestMethod {
+    INVALID,
+    GET,
+    POST,
+    OPTIONS,
+  };
+
+  explicit DataSocket(NativeSocket socket)
+      : SocketBase(socket),
+        method_(INVALID),
+        content_length_(0) {
+  }
+
+  ~DataSocket() {
+  }
+
+  static const char kCrossOriginAllowHeaders[];
+
+  bool headers_received() const { return method_ != INVALID; }
+
+  RequestMethod method() const { return method_; }
+
+  const std::string& request_path() const { return request_path_; }
+  std::string request_arguments() const;
+
+  const std::string& data() const { return data_; }
+
+  const std::string& content_type() const { return content_type_; }
+
+  size_t content_length() const { return content_length_; }
+
+  bool request_received() const {
+    return headers_received() && (method_ != POST || data_received());
+  }
+
+  bool data_received() const {
+    return method_ != POST || data_.length() >= content_length_;
+  }
+
+  // Checks if the request path (minus arguments) matches a given path.
+  bool PathEquals(const char* path) const;
+
+  // Called when we have received some data from clients.
+  // Returns false if an error occurred.
+  bool OnDataAvailable(bool* close_socket);
+
+  // Send a raw buffer of bytes.
+  bool Send(const std::string& data) const;
+
+  // Send an HTTP response.  The |status| should start with a valid HTTP
+  // response code, followed by a string.  E.g. "200 OK".
+  // If |connection_close| is set to true, an extra "Connection: close" HTTP
+  // header will be included.  |content_type| is the mime content type, not
+  // including the "Content-Type: " string.
+  // |extra_headers| should be either empty or a list of headers where each
+  // header terminates with "\r\n".
+  // |data| is the body of the message.  Its length will be specified via
+  // a "Content-Length" header.
+  bool Send(const std::string& status, bool connection_close,
+            const std::string& content_type,
+            const std::string& extra_headers, const std::string& data) const;
+
+  // Clears all held state and prepares the socket for receiving a new request.
+  void Clear();
+
+ protected:
+  // A fairly relaxed HTTP header parser.  Parses the method, path and
+  // content length (POST only) of a request.
+  // Returns true if a valid request was received and no errors occurred.
+  bool ParseHeaders();
+
+  // Figures out whether the request is a GET or POST and what path is
+  // being requested.
+  bool ParseMethodAndPath(const char* begin, size_t len);
+
+  // Determines the length of the body and its MIME type.
+  bool ParseContentLengthAndType(const char* headers, size_t length);
+
+ protected:
+  RequestMethod method_;
+  size_t content_length_;
+  std::string content_type_;
+  std::string request_path_;
+  std::string request_headers_;
+  std::string data_;
+};
+
+// The server socket.  Accepts connections and generates DataSocket instances
+// for each new connection.
+class ListeningSocket : public SocketBase {
+ public:
+  ListeningSocket() {}
+
+  bool Listen(unsigned short port);
+  DataSocket* Accept() const;
+};
+
+#endif  // TALK_EXAMPLES_PEERCONNECTION_SERVER_DATA_SOCKET_H_
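
The five-argument DataSocket::Send declared here is what the server uses to assemble complete HTTP responses (status line, standard headers, CORS headers from kCrossOriginAllowHeaders, Content-Length, then the body), as implemented in data_socket.cc above. A minimal sketch of a typical call; SendPlainTextReply is an illustrative helper, not part of this change.

```cpp
// Sketch only: |ds| is assumed to be a connected DataSocket that has just
// finished receiving a request (see main.cc for the real call sites).
#include <string>

#include "webrtc/examples/peerconnection/server/data_socket.h"

bool SendPlainTextReply(const DataSocket& ds, const std::string& body) {
  // Roughly produces:
  //   HTTP/1.1 200 OK
  //   Server: PeerConnectionTestServer/0.1
  //   Cache-Control: no-cache
  //   Connection: close
  //   Content-Type: text/plain
  //   Content-Length: <body size>
  //   <CORS headers from kCrossOriginAllowHeaders>
  //
  //   <body>
  return ds.Send("200 OK", /*connection_close=*/true, "text/plain",
                 /*extra_headers=*/"", body);
}
```
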
diff --git a/examples/peerconnection/server/main.cc b/examples/peerconnection/server/main.cc
new file mode 100644
index 0000000..e69de9c
--- /dev/null
+++ b/examples/peerconnection/server/main.cc
@@ -0,0 +1,173 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+
+#include <vector>
+
+#include "webrtc/examples/peerconnection/server/data_socket.h"
+#include "webrtc/examples/peerconnection/server/peer_channel.h"
+#include "webrtc/examples/peerconnection/server/utils.h"
+#include "webrtc/base/flags.h"
+
+DEFINE_bool(help, false, "Prints this message");
+DEFINE_int(port, 8888, "The port on which to listen.");
+
+static const size_t kMaxConnections = (FD_SETSIZE - 2);
+
+void HandleBrowserRequest(DataSocket* ds, bool* quit) {
+  assert(ds && ds->valid());
+  assert(quit);
+
+  const std::string& path = ds->request_path();
+
+  *quit = (path.compare("/quit") == 0);
+
+  if (*quit) {
+    ds->Send("200 OK", true, "text/html", "",
+             "<html><body>Quitting...</body></html>");
+  } else if (ds->method() == DataSocket::OPTIONS) {
+    // We'll get this when a browser makes a cross-origin resource sharing
+    // (CORS) request.
+    // The headers to allow cross-origin script support will be set inside
+    // Send.
+    ds->Send("200 OK", true, "", "", "");
+  } else {
+    // Here we could write some useful output back to the browser depending on
+    // the path.
+    printf("Received an invalid request: %s\n", ds->request_path().c_str());
+    ds->Send("500 Sorry", true, "text/html", "",
+             "<html><body>Sorry, not yet implemented</body></html>");
+  }
+}
+
+int main(int argc, char** argv) {
+  rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true);
+  if (FLAG_help) {
+    rtc::FlagList::Print(NULL, false);
+    return 0;
+  }
+
+  // Abort if the user specifies a port that is outside the allowed
+  // range [1, 65535].
+  if ((FLAG_port < 1) || (FLAG_port > 65535)) {
+    printf("Error: %i is not a valid port.\n", FLAG_port);
+    return -1;
+  }
+
+  ListeningSocket listener;
+  if (!listener.Create()) {
+    printf("Failed to create server socket\n");
+    return -1;
+  } else if (!listener.Listen(FLAG_port)) {
+    printf("Failed to listen on server socket\n");
+    return -1;
+  }
+
+  printf("Server listening on port %i\n", FLAG_port);
+
+  PeerChannel clients;
+  typedef std::vector<DataSocket*> SocketArray;
+  SocketArray sockets;
+  bool quit = false;
+  while (!quit) {
+    fd_set socket_set;
+    FD_ZERO(&socket_set);
+    if (listener.valid())
+      FD_SET(listener.socket(), &socket_set);
+
+    for (SocketArray::iterator i = sockets.begin(); i != sockets.end(); ++i)
+      FD_SET((*i)->socket(), &socket_set);
+
+    struct timeval timeout = { 10, 0 };
+    if (select(FD_SETSIZE, &socket_set, NULL, NULL, &timeout) == SOCKET_ERROR) {
+      printf("select failed\n");
+      break;
+    }
+
+    for (SocketArray::iterator i = sockets.begin(); i != sockets.end(); ++i) {
+      DataSocket* s = *i;
+      bool socket_done = true;
+      if (FD_ISSET(s->socket(), &socket_set)) {
+        if (s->OnDataAvailable(&socket_done) && s->request_received()) {
+          ChannelMember* member = clients.Lookup(s);
+          if (member || PeerChannel::IsPeerConnection(s)) {
+            if (!member) {
+              if (s->PathEquals("/sign_in")) {
+                clients.AddMember(s);
+              } else {
+                printf("No member found for: %s\n",
+                    s->request_path().c_str());
+                s->Send("500 Error", true, "text/plain", "",
+                        "Peer most likely gone.");
+              }
+            } else if (member->is_wait_request(s)) {
+              // no need to do anything.
+              socket_done = false;
+            } else {
+              ChannelMember* target = clients.IsTargetedRequest(s);
+              if (target) {
+                member->ForwardRequestToPeer(s, target);
+              } else if (s->PathEquals("/sign_out")) {
+                s->Send("200 OK", true, "text/plain", "", "");
+              } else {
+                printf("Couldn't find target for request: %s\n",
+                    s->request_path().c_str());
+                s->Send("500 Error", true, "text/plain", "",
+                        "Peer most likely gone.");
+              }
+            }
+          } else {
+            HandleBrowserRequest(s, &quit);
+            if (quit) {
+              printf("Quitting...\n");
+              FD_CLR(listener.socket(), &socket_set);
+              listener.Close();
+              clients.CloseAll();
+            }
+          }
+        }
+      } else {
+        socket_done = false;
+      }
+
+      if (socket_done) {
+        printf("Disconnecting socket\n");
+        clients.OnClosing(s);
+        assert(s->valid());  // Close must not have been called yet.
+        FD_CLR(s->socket(), &socket_set);
+        delete (*i);
+        i = sockets.erase(i);
+        if (i == sockets.end())
+          break;
+      }
+    }
+
+    clients.CheckForTimeout();
+
+    if (FD_ISSET(listener.socket(), &socket_set)) {
+      DataSocket* s = listener.Accept();
+      if (sockets.size() >= kMaxConnections) {
+        delete s;  // sorry, that's all we can take.
+        printf("Connection limit reached\n");
+      } else {
+        sockets.push_back(s);
+        printf("New connection...\n");
+      }
+    }
+  }
+
+  for (SocketArray::iterator i = sockets.begin(); i != sockets.end(); ++i)
+    delete (*i);
+  sockets.clear();
+
+  return 0;
+}
diff --git a/examples/peerconnection/server/peer_channel.cc b/examples/peerconnection/server/peer_channel.cc
new file mode 100644
index 0000000..150e5de
--- /dev/null
+++ b/examples/peerconnection/server/peer_channel.cc
@@ -0,0 +1,363 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/server/peer_channel.h"
+
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+
+#include <algorithm>
+
+#include "webrtc/examples/peerconnection/server/data_socket.h"
+#include "webrtc/examples/peerconnection/server/utils.h"
+#include "webrtc/base/stringutils.h"
+
+using rtc::sprintfn;
+
+// Set to the peer id of the originator when messages are being
+// exchanged between peers, but set to the id of the receiving peer
+// itself when notifications are sent from the server about the state
+// of other peers.
+//
+// WORKAROUND: Since support for CORS varies greatly from one browser to the
+// next, we don't use a custom name for our peer-id header (originally it was
+// "X-Peer-Id: ").  Instead, we use a "simple header", "Pragma" which should
+// always be exposed to CORS requests.  There is a special CORS header devoted
+// to exposing proprietary headers (Access-Control-Expose-Headers), however
+// at this point it is not working correctly in some popular browsers.
+static const char kPeerIdHeader[] = "Pragma: ";
+
+static const char* kRequestPaths[] = {
+  "/wait", "/sign_out", "/message",
+};
+
+enum RequestPathIndex {
+  kWait,
+  kSignOut,
+  kMessage,
+};
+
+const size_t kMaxNameLength = 512;
+
+//
+// ChannelMember
+//
+
+int ChannelMember::s_member_id_ = 0;
+
+ChannelMember::ChannelMember(DataSocket* socket)
+  : waiting_socket_(NULL), id_(++s_member_id_),
+    connected_(true), timestamp_(time(NULL)) {
+  assert(socket);
+  assert(socket->method() == DataSocket::GET);
+  assert(socket->PathEquals("/sign_in"));
+  name_ = socket->request_arguments();  // TODO: urldecode
+  if (name_.empty())
+    name_ = "peer_" + int2str(id_);
+  else if (name_.length() > kMaxNameLength)
+    name_.resize(kMaxNameLength);
+
+  std::replace(name_.begin(), name_.end(), ',', '_');
+}
+
+ChannelMember::~ChannelMember() {
+}
+
+bool ChannelMember::is_wait_request(DataSocket* ds) const {
+  return ds && ds->PathEquals(kRequestPaths[kWait]);
+}
+
+bool ChannelMember::TimedOut() {
+  return waiting_socket_ == NULL && (time(NULL) - timestamp_) > 30;
+}
+
+std::string ChannelMember::GetPeerIdHeader() const {
+  std::string ret(kPeerIdHeader + int2str(id_) + "\r\n");
+  return ret;
+}
+
+bool ChannelMember::NotifyOfOtherMember(const ChannelMember& other) {
+  assert(&other != this);
+  QueueResponse("200 OK", "text/plain", GetPeerIdHeader(),
+                other.GetEntry());
+  return true;
+}
+
+// Returns a string in the form "name,id,connected\n".
+std::string ChannelMember::GetEntry() const {
+  assert(name_.length() <= kMaxNameLength);
+
+  // name, 11-digit int, 1-digit bool, newline, null
+  char entry[kMaxNameLength + 15];
+  sprintfn(entry, sizeof(entry), "%s,%d,%d\n",
+           name_.substr(0, kMaxNameLength).c_str(), id_, connected_);
+  return entry;
+}
+
+void ChannelMember::ForwardRequestToPeer(DataSocket* ds, ChannelMember* peer) {
+  assert(peer);
+  assert(ds);
+
+  std::string extra_headers(GetPeerIdHeader());
+
+  if (peer == this) {
+    ds->Send("200 OK", true, ds->content_type(), extra_headers,
+             ds->data());
+  } else {
+    printf("Client %s sending to %s\n",
+        name_.c_str(), peer->name().c_str());
+    peer->QueueResponse("200 OK", ds->content_type(), extra_headers,
+                        ds->data());
+    ds->Send("200 OK", true, "text/plain", "", "");
+  }
+}
+
+void ChannelMember::OnClosing(DataSocket* ds) {
+  if (ds == waiting_socket_) {
+    waiting_socket_ = NULL;
+    timestamp_ = time(NULL);
+  }
+}
+
+void ChannelMember::QueueResponse(const std::string& status,
+                                  const std::string& content_type,
+                                  const std::string& extra_headers,
+                                  const std::string& data) {
+  if (waiting_socket_) {
+    assert(queue_.size() == 0);
+    assert(waiting_socket_->method() == DataSocket::GET);
+    bool ok = waiting_socket_->Send(status, true, content_type, extra_headers,
+                                    data);
+    if (!ok) {
+      printf("Failed to deliver data to waiting socket\n");
+    }
+    waiting_socket_ = NULL;
+    timestamp_ = time(NULL);
+  } else {
+    QueuedResponse qr;
+    qr.status = status;
+    qr.content_type = content_type;
+    qr.extra_headers = extra_headers;
+    qr.data = data;
+    queue_.push(qr);
+  }
+}
+
+void ChannelMember::SetWaitingSocket(DataSocket* ds) {
+  assert(ds->method() == DataSocket::GET);
+  if (ds && !queue_.empty()) {
+    assert(waiting_socket_ == NULL);
+    const QueuedResponse& response = queue_.front();
+    ds->Send(response.status, true, response.content_type,
+             response.extra_headers, response.data);
+    queue_.pop();
+  } else {
+    waiting_socket_ = ds;
+  }
+}
+
+//
+// PeerChannel
+//
+
+// static
+bool PeerChannel::IsPeerConnection(const DataSocket* ds) {
+  assert(ds);
+  return (ds->method() == DataSocket::POST && ds->content_length() > 0) ||
+         (ds->method() == DataSocket::GET && ds->PathEquals("/sign_in"));
+}
+
+ChannelMember* PeerChannel::Lookup(DataSocket* ds) const {
+  assert(ds);
+
+  if (ds->method() != DataSocket::GET && ds->method() != DataSocket::POST)
+    return NULL;
+
+  size_t i = 0;
+  for (; i < ARRAYSIZE(kRequestPaths); ++i) {
+    if (ds->PathEquals(kRequestPaths[i]))
+      break;
+  }
+
+  if (i == ARRAYSIZE(kRequestPaths))
+    return NULL;
+
+  std::string args(ds->request_arguments());
+  static const char kPeerId[] = "peer_id=";
+  size_t found = args.find(kPeerId);
+  if (found == std::string::npos)
+    return NULL;
+
+  int id = atoi(&args[found + ARRAYSIZE(kPeerId) - 1]);
+  Members::const_iterator iter = members_.begin();
+  for (; iter != members_.end(); ++iter) {
+    if (id == (*iter)->id()) {
+      if (i == kWait)
+        (*iter)->SetWaitingSocket(ds);
+      if (i == kSignOut)
+        (*iter)->set_disconnected();
+      return *iter;
+    }
+  }
+
+  return NULL;
+}
+
+ChannelMember* PeerChannel::IsTargetedRequest(const DataSocket* ds) const {
+  assert(ds);
+  // Regardless of GET or POST, we look for the target peer id (the "to"
+  // parameter) only in the request_path.
+  const std::string& path = ds->request_path();
+  size_t args = path.find('?');
+  if (args == std::string::npos)
+    return NULL;
+  size_t found;
+  const char kTargetPeerIdParam[] = "to=";
+  do {
+    found = path.find(kTargetPeerIdParam, args);
+    if (found == std::string::npos)
+      return NULL;
+    if (found == (args + 1) || path[found - 1] == '&') {
+      found += ARRAYSIZE(kTargetPeerIdParam) - 1;
+      break;
+    }
+    args = found + ARRAYSIZE(kTargetPeerIdParam) - 1;
+  } while (true);
+  int id = atoi(&path[found]);
+  Members::const_iterator i = members_.begin();
+  for (; i != members_.end(); ++i) {
+    if ((*i)->id() == id) {
+      return *i;
+    }
+  }
+  return NULL;
+}
+
+bool PeerChannel::AddMember(DataSocket* ds) {
+  assert(IsPeerConnection(ds));
+  ChannelMember* new_guy = new ChannelMember(ds);
+  Members failures;
+  BroadcastChangedState(*new_guy, &failures);
+  HandleDeliveryFailures(&failures);
+  members_.push_back(new_guy);
+
+  printf("New member added (total=%s): %s\n",
+      size_t2str(members_.size()).c_str(), new_guy->name().c_str());
+
+  // Let the newly connected peer know about other members of the channel.
+  std::string content_type;
+  std::string response = BuildResponseForNewMember(*new_guy, &content_type);
+  ds->Send("200 Added", true, content_type, new_guy->GetPeerIdHeader(),
+           response);
+  return true;
+}
+
+void PeerChannel::CloseAll() {
+  Members::const_iterator i = members_.begin();
+  for (; i != members_.end(); ++i) {
+    (*i)->QueueResponse("200 OK", "text/plain", "", "Server shutting down");
+  }
+  DeleteAll();
+}
+
+void PeerChannel::OnClosing(DataSocket* ds) {
+  for (Members::iterator i = members_.begin(); i != members_.end(); ++i) {
+    ChannelMember* m = (*i);
+    m->OnClosing(ds);
+    if (!m->connected()) {
+      i = members_.erase(i);
+      Members failures;
+      BroadcastChangedState(*m, &failures);
+      HandleDeliveryFailures(&failures);
+      delete m;
+      if (i == members_.end())
+        break;
+    }
+  }
+  printf("Total connected: %s\n", size_t2str(members_.size()).c_str());
+}
+
+void PeerChannel::CheckForTimeout() {
+  for (Members::iterator i = members_.begin(); i != members_.end(); ++i) {
+    ChannelMember* m = (*i);
+    if (m->TimedOut()) {
+      printf("Timeout: %s\n", m->name().c_str());
+      m->set_disconnected();
+      i = members_.erase(i);
+      Members failures;
+      BroadcastChangedState(*m, &failures);
+      HandleDeliveryFailures(&failures);
+      delete m;
+      if (i == members_.end())
+        break;
+    }
+  }
+}
+
+void PeerChannel::DeleteAll() {
+  for (Members::iterator i = members_.begin(); i != members_.end(); ++i)
+    delete (*i);
+  members_.clear();
+}
+
+void PeerChannel::BroadcastChangedState(const ChannelMember& member,
+                                        Members* delivery_failures) {
+  // This function should be called prior to DataSocket::Close().
+  assert(delivery_failures);
+
+  if (!member.connected()) {
+    printf("Member disconnected: %s\n", member.name().c_str());
+  }
+
+  Members::iterator i = members_.begin();
+  for (; i != members_.end(); ++i) {
+    if (&member != (*i)) {
+      if (!(*i)->NotifyOfOtherMember(member)) {
+        (*i)->set_disconnected();
+        delivery_failures->push_back(*i);
+        i = members_.erase(i);
+        if (i == members_.end())
+          break;
+      }
+    }
+  }
+}
+
+void PeerChannel::HandleDeliveryFailures(Members* failures) {
+  assert(failures);
+
+  while (!failures->empty()) {
+    Members::iterator i = failures->begin();
+    ChannelMember* member = *i;
+    assert(!member->connected());
+    failures->erase(i);
+    BroadcastChangedState(*member, failures);
+    delete member;
+  }
+}
+
+// Builds a simple list of "name,id,connected\n" entries for each member.
+std::string PeerChannel::BuildResponseForNewMember(const ChannelMember& member,
+                                                   std::string* content_type) {
+  assert(content_type);
+
+  *content_type = "text/plain";
+  // The peer itself will always be the first entry.
+  std::string response(member.GetEntry());
+  for (Members::iterator i = members_.begin(); i != members_.end(); ++i) {
+    if (member.id() != (*i)->id()) {
+      assert((*i)->connected());
+      response += (*i)->GetEntry();
+    }
+  }
+
+  return response;
+}
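
The peer-list body produced by BuildResponseForNewMember (and the per-peer notifications queued via QueueResponse) consists of the "name,id,connected\n" entries returned by GetEntry, with the receiver's own id carried in the Pragma header. A minimal standalone sketch of parsing such a body, mirroring what PeerConnectionClient::ParseEntry does on the client side; ParsePeerList is an illustrative name only.

```cpp
// Sketch only: parses a peer-list body of "name,id,connected\n" lines.
#include <stdlib.h>

#include <sstream>
#include <string>
#include <vector>

struct PeerEntry {
  std::string name;
  int id;
  bool connected;
};

std::vector<PeerEntry> ParsePeerList(const std::string& body) {
  std::vector<PeerEntry> peers;
  std::istringstream lines(body);
  std::string line;
  while (std::getline(lines, line)) {
    size_t first = line.find(',');
    size_t second = line.find(',', first + 1);
    if (first == std::string::npos || second == std::string::npos)
      continue;  // Malformed entry; skip it.
    PeerEntry entry;
    entry.name = line.substr(0, first);
    entry.id = atoi(line.c_str() + first + 1);
    entry.connected = atoi(line.c_str() + second + 1) != 0;
    peers.push_back(entry);
  }
  return peers;
}
```
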
diff --git a/examples/peerconnection/server/peer_channel.h b/examples/peerconnection/server/peer_channel.h
new file mode 100644
index 0000000..263f17d
--- /dev/null
+++ b/examples/peerconnection/server/peer_channel.h
@@ -0,0 +1,120 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef TALK_EXAMPLES_PEERCONNECTION_SERVER_PEER_CHANNEL_H_
+#define TALK_EXAMPLES_PEERCONNECTION_SERVER_PEER_CHANNEL_H_
+#pragma once
+
+#include <time.h>
+
+#include <queue>
+#include <string>
+#include <vector>
+
+class DataSocket;
+
+// Represents a single peer connected to the server.
+class ChannelMember {
+ public:
+  explicit ChannelMember(DataSocket* socket);
+  ~ChannelMember();
+
+  bool connected() const { return connected_; }
+  int id() const { return id_; }
+  void set_disconnected() { connected_ = false; }
+  bool is_wait_request(DataSocket* ds) const;
+  const std::string& name() const { return name_; }
+
+  bool TimedOut();
+
+  std::string GetPeerIdHeader() const;
+
+  bool NotifyOfOtherMember(const ChannelMember& other);
+
+  // Returns a string in the form "name,id,connected\n".
+  std::string GetEntry() const;
+
+  void ForwardRequestToPeer(DataSocket* ds, ChannelMember* peer);
+
+  void OnClosing(DataSocket* ds);
+
+  void QueueResponse(const std::string& status, const std::string& content_type,
+                     const std::string& extra_headers, const std::string& data);
+
+  void SetWaitingSocket(DataSocket* ds);
+
+ protected:
+  struct QueuedResponse {
+    std::string status, content_type, extra_headers, data;
+  };
+
+  DataSocket* waiting_socket_;
+  int id_;
+  bool connected_;
+  time_t timestamp_;
+  std::string name_;
+  std::queue<QueuedResponse> queue_;
+  static int s_member_id_;
+};
+
+// Manages all currently connected peers.
+class PeerChannel {
+ public:
+  typedef std::vector<ChannelMember*> Members;
+
+  PeerChannel() {
+  }
+
+  ~PeerChannel() {
+    DeleteAll();
+  }
+
+  const Members& members() const { return members_; }
+
+  // Returns true if the request should be treated as a new ChannelMember
+  // request.  Otherwise the request is not peerconnection related.
+  static bool IsPeerConnection(const DataSocket* ds);
+
+  // Finds a connected peer that's associated with the |ds| socket.
+  ChannelMember* Lookup(DataSocket* ds) const;
+
+  // Checks if the request has a "peer_id" parameter and if so, looks up the
+  // peer for which the request is targeted at.
+  ChannelMember* IsTargetedRequest(const DataSocket* ds) const;
+
+  // Adds a new ChannelMember instance to the list of connected peers and
+  // associates it with the socket.
+  bool AddMember(DataSocket* ds);
+
+  // Closes all connections and sends a "shutting down" message to all
+  // connected peers.
+  void CloseAll();
+
+  // Called when a socket was determined to be closing by the peer (or if the
+  // connection went dead).
+  void OnClosing(DataSocket* ds);
+
+  void CheckForTimeout();
+
+ protected:
+  void DeleteAll();
+  void BroadcastChangedState(const ChannelMember& member,
+                             Members* delivery_failures);
+  void HandleDeliveryFailures(Members* failures);
+
+  // Builds a simple list of "name,id,connected\n" entries for each member.
+  std::string BuildResponseForNewMember(const ChannelMember& member,
+                                        std::string* content_type);
+
+ protected:
+  Members members_;
+};
+
+#endif  // TALK_EXAMPLES_PEERCONNECTION_SERVER_PEER_CHANNEL_H_
diff --git a/examples/peerconnection/server/server_test.html b/examples/peerconnection/server/server_test.html
new file mode 100644
index 0000000..0a165f1
--- /dev/null
+++ b/examples/peerconnection/server/server_test.html
@@ -0,0 +1,237 @@
+<html>
+<head>
+<title>PeerConnection server test page</title>
+
+<script>
+var request = null;
+var hangingGet = null;
+var localName;
+var server;
+var my_id = -1;
+var other_peers = {};
+var message_counter = 0;
+
+function trace(txt) {
+  var elem = document.getElementById("debug");
+  elem.innerHTML += txt + "<br>";
+}
+
+function handleServerNotification(data) {
+  trace("Server notification: " + data);
+  var parsed = data.split(',');
+  if (parseInt(parsed[2]) != 0)
+    other_peers[parseInt(parsed[1])] = parsed[0];
+}
+
+function handlePeerMessage(peer_id, data) {
+  ++message_counter;
+  var str = "Message from '" + other_peers[peer_id] + "'&nbsp;";
+  str += "<span id='toggle_" + message_counter + "' onclick='toggleMe(this);' ";
+  str += "style='cursor: pointer'>+</span><br>";
+  str += "<blockquote id='msg_" + message_counter + "' style='display:none'>";
+  str += data + "</blockquote>";
+  trace(str);
+  if (document.getElementById("loopback").checked) {
+    if (data.search("offer") != -1) {
+      // In loopback mode, if DTLS is enabled, notify the client to disable it.
+      // Otherwise replace the offer with an answer.
+      if (data.search("fingerprint") != -1)
+        data = data.replace("offer", "offer-loopback");
+      else
+        data = data.replace("offer", "answer");
+    }
+    sendToPeer(peer_id, data);
+  }
+}
+
+function GetIntHeader(r, name) {
+  var val = r.getResponseHeader(name);
+  return val != null && val.length ? parseInt(val) : -1;
+}
+
+function hangingGetCallback() {
+  try {
+    if (hangingGet.readyState != 4)
+      return;
+    if (hangingGet.status != 200) {
+      trace("server error: " + hangingGet.statusText);
+      disconnect();
+    } else {
+      var peer_id = GetIntHeader(hangingGet, "Pragma");
+      if (peer_id == my_id) {
+        handleServerNotification(hangingGet.responseText);
+      } else {
+        handlePeerMessage(peer_id, hangingGet.responseText);
+      }
+    }
+
+    if (hangingGet) {
+      hangingGet.abort();
+      hangingGet = null;
+    }
+
+    if (my_id != -1)
+      window.setTimeout(startHangingGet, 0);
+  } catch (e) {
+    trace("Hanging get error: " + e.description);
+  }
+}
+
+function startHangingGet() {
+  try {
+    hangingGet = new XMLHttpRequest();
+    hangingGet.onreadystatechange = hangingGetCallback;
+    hangingGet.ontimeout = onHangingGetTimeout;
+    hangingGet.open("GET", server + "/wait?peer_id=" + my_id, true);
+    hangingGet.send();
+  } catch (e) {
+    trace("error" + e.description);
+  }
+}
+
+function onHangingGetTimeout() {
+  trace("hanging get timeout. issuing again.");
+  hangingGet.abort();
+  hangingGet = null;
+  if (my_id != -1)
+    window.setTimeout(startHangingGet, 0);
+}
+
+function signInCallback() {
+  try {
+    if (request.readyState == 4) {
+      if (request.status == 200) {
+        var peers = request.responseText.split("\n");
+        my_id = parseInt(peers[0].split(',')[1]);
+        trace("My id: " + my_id);
+        for (var i = 1; i < peers.length; ++i) {
+          if (peers[i].length > 0) {
+            trace("Peer " + i + ": " + peers[i]);
+            var parsed = peers[i].split(',');
+            other_peers[parseInt(parsed[1])] = parsed[0];
+          }
+        }
+        startHangingGet();
+        request = null;
+      }
+    }
+  } catch (e) {
+    trace("error: " + e.description);
+  }
+}
+
+function signIn() {
+  try {
+    request = new XMLHttpRequest();
+    request.onreadystatechange = signInCallback;
+    request.open("GET", server + "/sign_in?" + localName, true);
+    request.send();
+  } catch (e) {
+    trace("error: " + e.description);
+  }
+}
+
+function sendToPeer(peer_id, data) {
+  if (my_id == -1) {
+    alert("Not connected");
+    return;
+  }
+  if (peer_id == my_id) {
+    alert("Can't send a message to oneself :)");
+    return;
+  }
+  var r = new XMLHttpRequest();
+  r.open("POST", server + "/message?peer_id=" + my_id + "&to=" + peer_id,
+         false);
+  r.setRequestHeader("Content-Type", "text/plain");
+  r.send(data);
+  r = null;
+}
+
+function connect() {
+  localName = document.getElementById("local").value.toLowerCase();
+  server = document.getElementById("server").value.toLowerCase();
+  if (localName.length == 0) {
+    alert("I need a name please.");
+    document.getElementById("local").focus();
+  } else {
+    document.getElementById("connect").disabled = true;
+    document.getElementById("disconnect").disabled = false;
+    document.getElementById("send").disabled = false;
+    signIn();
+  }
+}
+
+function disconnect() {
+  if (request) {
+    request.abort();
+    request = null;
+  }
+  
+  if (hangingGet) {
+    hangingGet.abort();
+    hangingGet = null;
+  }
+
+  if (my_id != -1) {
+    request = new XMLHttpRequest();
+    request.open("GET", server + "/sign_out?peer_id=" + my_id, false);
+    request.send();
+    request = null;
+    my_id = -1;
+  }
+
+  document.getElementById("connect").disabled = false;
+  document.getElementById("disconnect").disabled = true;
+  document.getElementById("send").disabled = true;
+}
+
+window.onbeforeunload = disconnect;
+
+function send() {
+  var text = document.getElementById("message").value;
+  var peer_id = parseInt(document.getElementById("peer_id").value);
+  if (!text.length || peer_id == 0) {
+    alert("No text supplied or invalid peer id");
+  } else {
+    sendToPeer(peer_id, text);
+  }
+}
+
+function toggleMe(obj) {
+  var id = obj.id.replace("toggle", "msg");
+  var t = document.getElementById(id);
+  if (obj.innerText == "+") {
+    obj.innerText = "-";
+    t.style.display = "block";
+  } else {
+    obj.innerText = "+";
+    t.style.display = "none";
+  }
+}
+
+</script>
+
+</head>
+<body>
+Server: <input type="text" id="server" value="http://localhost:8888" /><br>
+<input type="checkbox" id="loopback" checked="checked"/> Loopback (just send
+received messages right back)<br>
+Your name: <input type="text" id="local" value="my_name"/>
+<button id="connect" onclick="connect();">Connect</button>
+<button disabled="true" id="disconnect"
+        onclick="disconnect();">Disconnect</button>
+<br>
+<table><tr><td>
+Target peer id: <input type="text" id="peer_id" size="3"/></td><td>
+Message: <input type="text" id="message"/></td><td>
+<button disabled="true" id="send" onclick="send();">Send</button>
+</td></tr></table>
+<button onclick="document.getElementById('debug').innerHTML='';">
+Clear log</button>
+
+<pre id="debug">
+</pre>
+<br><hr>
+</body>
+</html>
diff --git a/examples/peerconnection/server/utils.cc b/examples/peerconnection/server/utils.cc
new file mode 100644
index 0000000..93a6d05
--- /dev/null
+++ b/examples/peerconnection/server/utils.cc
@@ -0,0 +1,25 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include "webrtc/examples/peerconnection/server/utils.h"
+
+#include <stdio.h>
+
+#include "webrtc/base/stringencode.h"
+
+using rtc::ToString;
+
+std::string int2str(int i) {
+  return ToString<int>(i);
+}
+
+std::string size_t2str(size_t i) {
+  return ToString<size_t>(i);
+}
diff --git a/examples/peerconnection/server/utils.h b/examples/peerconnection/server/utils.h
new file mode 100644
index 0000000..e70968b
--- /dev/null
+++ b/examples/peerconnection/server/utils.h
@@ -0,0 +1,25 @@
+/*
+ *  Copyright 2011 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#ifndef TALK_EXAMPLES_PEERCONNECTION_SERVER_UTILS_H_
+#define TALK_EXAMPLES_PEERCONNECTION_SERVER_UTILS_H_
+#pragma once
+
+#include <assert.h>
+#include <string>
+
+#ifndef ARRAYSIZE
+#define ARRAYSIZE(x) (sizeof(x) / sizeof(x[0]))
+#endif
+
+std::string int2str(int i);
+std::string size_t2str(size_t i);
+
+#endif  // TALK_EXAMPLES_PEERCONNECTION_SERVER_UTILS_H_
diff --git a/examples/relayserver/relayserver_main.cc b/examples/relayserver/relayserver_main.cc
new file mode 100644
index 0000000..31f43c4
--- /dev/null
+++ b/examples/relayserver/relayserver_main.cc
@@ -0,0 +1,63 @@
+/*
+ *  Copyright 2004 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include <iostream>  // NOLINT
+
+#include "webrtc/p2p/base/relayserver.h"
+#include "webrtc/base/scoped_ptr.h"
+#include "webrtc/base/thread.h"
+
+int main(int argc, char **argv) {
+  if (argc != 3) {
+    std::cerr << "usage: relayserver internal-address external-address"
+              << std::endl;
+    return 1;
+  }
+
+  rtc::SocketAddress int_addr;
+  if (!int_addr.FromString(argv[1])) {
+    std::cerr << "Unable to parse IP address: " << argv[1];
+    return 1;
+  }
+
+  rtc::SocketAddress ext_addr;
+  if (!ext_addr.FromString(argv[2])) {
+    std::cerr << "Unable to parse IP address: " << argv[2];
+    return 1;
+  }
+
+  rtc::Thread *pthMain = rtc::Thread::Current();
+
+  rtc::scoped_ptr<rtc::AsyncUDPSocket> int_socket(
+      rtc::AsyncUDPSocket::Create(pthMain->socketserver(), int_addr));
+  if (!int_socket) {
+    std::cerr << "Failed to create a UDP socket bound at"
+              << int_addr.ToString() << std::endl;
+    return 1;
+  }
+
+  rtc::scoped_ptr<rtc::AsyncUDPSocket> ext_socket(
+      rtc::AsyncUDPSocket::Create(pthMain->socketserver(), ext_addr));
+  if (!ext_socket) {
+    std::cerr << "Failed to create a UDP socket bound at"
+              << ext_addr.ToString() << std::endl;
+    return 1;
+  }
+
+  cricket::RelayServer server(pthMain);
+  server.AddInternalSocket(int_socket.get());
+  server.AddExternalSocket(ext_socket.get());
+
+  std::cout << "Listening internally at " << int_addr.ToString() << std::endl;
+  std::cout << "Listening externally at " << ext_addr.ToString() << std::endl;
+
+  pthMain->Run();
+  return 0;
+}
diff --git a/examples/stunserver/stunserver_main.cc b/examples/stunserver/stunserver_main.cc
new file mode 100644
index 0000000..9cbd615
--- /dev/null
+++ b/examples/stunserver/stunserver_main.cc
@@ -0,0 +1,51 @@
+/*
+ *  Copyright 2004 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#if defined(WEBRTC_POSIX)
+#include <errno.h>
+#endif  // WEBRTC_POSIX
+
+#include <iostream>
+
+#include "webrtc/p2p/base/stunserver.h"
+#include "webrtc/base/thread.h"
+
+using namespace cricket;
+
+int main(int argc, char* argv[]) {
+  if (argc != 2) {
+    std::cerr << "usage: stunserver address" << std::endl;
+    return 1;
+  }
+
+  rtc::SocketAddress server_addr;
+  if (!server_addr.FromString(argv[1])) {
+    std::cerr << "Unable to parse IP address: " << argv[1];
+    return 1;
+  }
+
+  rtc::Thread *pthMain = rtc::Thread::Current();
+
+  rtc::AsyncUDPSocket* server_socket =
+      rtc::AsyncUDPSocket::Create(pthMain->socketserver(), server_addr);
+  if (!server_socket) {
+    std::cerr << "Failed to create a UDP socket" << std::endl;
+    return 1;
+  }
+
+  StunServer* server = new StunServer(server_socket);
+
+  std::cout << "Listening at " << server_addr.ToString() << std::endl;
+
+  pthMain->Run();
+
+  delete server;
+  return 0;
+}
diff --git a/examples/turnserver/turnserver_main.cc b/examples/turnserver/turnserver_main.cc
new file mode 100644
index 0000000..e7b464f
--- /dev/null
+++ b/examples/turnserver/turnserver_main.cc
@@ -0,0 +1,85 @@
+/*
+ *  Copyright 2012 The WebRTC Project Authors. All rights reserved.
+ *
+ *  Use of this source code is governed by a BSD-style license
+ *  that can be found in the LICENSE file in the root of the source
+ *  tree. An additional intellectual property rights grant can be found
+ *  in the file PATENTS.  All contributing project authors may
+ *  be found in the AUTHORS file in the root of the source tree.
+ */
+
+#include <iostream>  // NOLINT
+
+#include "webrtc/p2p/base/basicpacketsocketfactory.h"
+#include "webrtc/p2p/base/turnserver.h"
+#include "webrtc/base/asyncudpsocket.h"
+#include "webrtc/base/optionsfile.h"
+#include "webrtc/base/stringencode.h"
+#include "webrtc/base/thread.h"
+
+static const char kSoftware[] = "libjingle TurnServer";
+
+class TurnFileAuth : public cricket::TurnAuthInterface {
+ public:
+  explicit TurnFileAuth(const std::string& path) : file_(path) {
+    file_.Load();
+  }
+  virtual bool GetKey(const std::string& username, const std::string& realm,
+                      std::string* key) {
+    // File is stored as lines of <username>=<HA1>.
+    // Generate HA1 via "echo -n "<username>:<realm>:<password>" | md5sum"
+    std::string hex;
+    bool ret = file_.GetStringValue(username, &hex);
+    if (ret) {
+      char buf[32];
+      size_t len = rtc::hex_decode(buf, sizeof(buf), hex);
+      *key = std::string(buf, len);
+    }
+    return ret;
+  }
+ private:
+  rtc::OptionsFile file_;
+};
+
+int main(int argc, char **argv) {
+  if (argc != 5) {
+    std::cerr << "usage: turnserver int-addr ext-ip realm auth-file"
+              << std::endl;
+    return 1;
+  }
+
+  rtc::SocketAddress int_addr;
+  if (!int_addr.FromString(argv[1])) {
+    std::cerr << "Unable to parse IP address: " << argv[1] << std::endl;
+    return 1;
+  }
+
+  rtc::IPAddress ext_addr;
+  if (!IPFromString(argv[2], &ext_addr)) {
+    std::cerr << "Unable to parse IP address: " << argv[2] << std::endl;
+    return 1;
+  }
+
+  rtc::Thread* main = rtc::Thread::Current();
+  rtc::AsyncUDPSocket* int_socket =
+      rtc::AsyncUDPSocket::Create(main->socketserver(), int_addr);
+  if (!int_socket) {
+    std::cerr << "Failed to create a UDP socket bound at"
+              << int_addr.ToString() << std::endl;
+    return 1;
+  }
+
+  cricket::TurnServer server(main);
+  TurnFileAuth auth(argv[4]);
+  server.set_realm(argv[3]);
+  server.set_software(kSoftware);
+  server.set_auth_hook(&auth);
+  server.AddInternalSocket(int_socket, cricket::PROTO_UDP);
+  server.SetExternalSocketFactory(new rtc::BasicPacketSocketFactory(),
+                                  rtc::SocketAddress(ext_addr, 0));
+
+  std::cout << "Listening internally at " << int_addr.ToString() << std::endl;
+
+  main->Run();
+  return 0;
+}
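TurnFileAuth::GetKey() above hex-decodes one <username>=<HA1> line per user, where HA1 = MD5("<username>:<realm>:<password>") as described in its comment. A minimal sketch of generating such a line, assuming rtc::ComputeDigest() from webrtc/base/messagedigest.h returns the hex-encoded digest and rtc::DIGEST_MD5 names the MD5 algorithm; MakeAuthFileLine and the user/realm/password values are hypothetical:

#include <iostream>
#include <string>

#include "webrtc/base/messagedigest.h"

// Hypothetical helper: builds one auth-file line in the <username>=<HA1>
// form that TurnFileAuth::GetKey() later hex-decodes.
std::string MakeAuthFileLine(const std::string& username,
                             const std::string& realm,
                             const std::string& password) {
  const std::string ha1_hex = rtc::ComputeDigest(
      rtc::DIGEST_MD5, username + ":" + realm + ":" + password);
  return username + "=" + ha1_hex;
}

int main() {
  // Prints the value that `echo -n "alice:example.org:secret" | md5sum`
  // would produce, prefixed with "alice=" (all values illustrative).
  std::cout << MakeAuthFileLine("alice", "example.org", "secret") << std::endl;
  return 0;
}

Each generated line would be appended to the auth-file passed as the fourth argument (argv[4]) to turnserver.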
diff --git a/libjingle_examples.gyp b/libjingle_examples.gyp
new file mode 100755
index 0000000..135932d
--- /dev/null
+++ b/libjingle_examples.gyp
@@ -0,0 +1,396 @@
+#
+# Copyright 2012 The WebRTC Project Authors. All rights reserved.
+#
+# Use of this source code is governed by a BSD-style license
+# that can be found in the LICENSE file in the root of the source
+# tree. An additional intellectual property rights grant can be found
+# in the file PATENTS.  All contributing project authors may
+# be found in the AUTHORS file in the root of the source tree.
+
+{
+  'includes': [
+    '../talk/build/common.gypi',
+  ],
+  'targets': [
+    {
+      'target_name': 'relayserver',
+      'type': 'executable',
+      'dependencies': [
+        '../talk/libjingle.gyp:libjingle',
+        '../talk/libjingle.gyp:libjingle_p2p',
+      ],
+      'sources': [
+        'examples/relayserver/relayserver_main.cc',
+      ],
+    },  # target relayserver
+    {
+      'target_name': 'stunserver',
+      'type': 'executable',
+      'dependencies': [
+        '../talk/libjingle.gyp:libjingle',
+        '../talk/libjingle.gyp:libjingle_p2p',
+      ],
+      'sources': [
+        'examples/stunserver/stunserver_main.cc',
+      ],
+    },  # target stunserver
+    {
+      'target_name': 'turnserver',
+      'type': 'executable',
+      'dependencies': [
+        '../talk/libjingle.gyp:libjingle',
+        '../talk/libjingle.gyp:libjingle_p2p',
+      ],
+      'sources': [
+        'examples/turnserver/turnserver_main.cc',
+      ],
+    },  # target turnserver
+    {
+      'target_name': 'peerconnection_server',
+      'type': 'executable',
+      'sources': [
+        'examples/peerconnection/server/data_socket.cc',
+        'examples/peerconnection/server/data_socket.h',
+        'examples/peerconnection/server/main.cc',
+        'examples/peerconnection/server/peer_channel.cc',
+        'examples/peerconnection/server/peer_channel.h',
+        'examples/peerconnection/server/utils.cc',
+        'examples/peerconnection/server/utils.h',
+      ],
+      'dependencies': [
+        '<(webrtc_root)/common.gyp:webrtc_common',
+        '../talk/libjingle.gyp:libjingle',
+      ],
+      # TODO(ronghuawu): crbug.com/167187 fix size_t to int truncations.
+      'msvs_disabled_warnings': [ 4309, ],
+    },  # target peerconnection_server
+  ],
+  'conditions': [
+    ['OS=="linux" or OS=="win"', {
+      'targets': [
+        {
+          'target_name': 'peerconnection_client',
+          'type': 'executable',
+          'sources': [
+            'examples/peerconnection/client/conductor.cc',
+            'examples/peerconnection/client/conductor.h',
+            'examples/peerconnection/client/defaults.cc',
+            'examples/peerconnection/client/defaults.h',
+            'examples/peerconnection/client/peer_connection_client.cc',
+            'examples/peerconnection/client/peer_connection_client.h',
+          ],
+          'dependencies': [
+            '../talk/libjingle.gyp:libjingle_peerconnection',
+            '<@(libjingle_tests_additional_deps)',
+          ],
+          'conditions': [
+            ['build_json==1', {
+              'dependencies': [
+                '<(DEPTH)/third_party/jsoncpp/jsoncpp.gyp:jsoncpp',
+              ],
+            }],
+            # TODO(ronghuawu): Move these files to a win/ directory then they
+            # can be excluded automatically.
+            ['OS=="win"', {
+              'sources': [
+                'examples/peerconnection/client/flagdefs.h',
+                'examples/peerconnection/client/main.cc',
+                'examples/peerconnection/client/main_wnd.cc',
+                'examples/peerconnection/client/main_wnd.h',
+              ],
+              'msvs_settings': {
+                'VCLinkerTool': {
+                  'SubSystem': '2',  # Windows
+                },
+              },
+            }],  # OS=="win"
+            ['OS=="linux"', {
+              'sources': [
+                'examples/peerconnection/client/linux/main.cc',
+                'examples/peerconnection/client/linux/main_wnd.cc',
+                'examples/peerconnection/client/linux/main_wnd.h',
+              ],
+              'cflags': [
+                '<!@(pkg-config --cflags glib-2.0 gobject-2.0 gtk+-2.0)',
+              ],
+              'link_settings': {
+                'ldflags': [
+                  '<!@(pkg-config --libs-only-L --libs-only-other glib-2.0'
+                      ' gobject-2.0 gthread-2.0 gtk+-2.0)',
+                ],
+                'libraries': [
+                  '<!@(pkg-config --libs-only-l glib-2.0 gobject-2.0'
+                      ' gthread-2.0 gtk+-2.0)',
+                  '-lX11',
+                  '-lXcomposite',
+                  '-lXext',
+                  '-lXrender',
+                ],
+              },
+            }],  # OS=="linux"
+          ],  # conditions
+        },  # target peerconnection_client
+      ],  # targets
+    }],  # OS=="linux" or OS=="win"
+
+    ['OS=="ios" or (OS=="mac" and target_arch!="ia32" and mac_sdk>="10.8")', {
+      'targets': [
+        {
+          'target_name': 'apprtc_common',
+          'type': 'static_library',
+          'dependencies': [
+            '../talk/libjingle.gyp:libjingle_peerconnection_objc',
+          ],
+          'sources': [
+            'examples/objc/AppRTCDemo/common/ARDUtilities.h',
+            'examples/objc/AppRTCDemo/common/ARDUtilities.m',
+          ],
+          'include_dirs': [
+            'examples/objc/AppRTCDemo/common',
+          ],
+          'direct_dependent_settings': {
+            'include_dirs': [
+              'examples/objc/AppRTCDemo/common',
+            ],
+          },
+          'conditions': [
+            ['OS=="mac"', {
+              'xcode_settings': {
+                'MACOSX_DEPLOYMENT_TARGET' : '10.8',
+              },
+            }],
+          ],
+        },
+        {
+          'target_name': 'apprtc_signaling',
+          'type': 'static_library',
+          'dependencies': [
+            'apprtc_common',
+            '../talk/libjingle.gyp:libjingle_peerconnection_objc',
+            'socketrocket',
+          ],
+          'sources': [
+            'examples/objc/AppRTCDemo/ARDAppClient.h',
+            'examples/objc/AppRTCDemo/ARDAppClient.m',
+            'examples/objc/AppRTCDemo/ARDAppClient+Internal.h',
+            'examples/objc/AppRTCDemo/ARDAppEngineClient.h',
+            'examples/objc/AppRTCDemo/ARDAppEngineClient.m',
+            'examples/objc/AppRTCDemo/ARDCEODTURNClient.h',
+            'examples/objc/AppRTCDemo/ARDCEODTURNClient.m',
+            'examples/objc/AppRTCDemo/ARDJoinResponse.h',
+            'examples/objc/AppRTCDemo/ARDJoinResponse.m',
+            'examples/objc/AppRTCDemo/ARDJoinResponse+Internal.h',
+            'examples/objc/AppRTCDemo/ARDMessageResponse.h',
+            'examples/objc/AppRTCDemo/ARDMessageResponse.m',
+            'examples/objc/AppRTCDemo/ARDMessageResponse+Internal.h',
+            'examples/objc/AppRTCDemo/ARDRoomServerClient.h',
+            'examples/objc/AppRTCDemo/ARDSDPUtils.h',
+            'examples/objc/AppRTCDemo/ARDSDPUtils.m',
+            'examples/objc/AppRTCDemo/ARDSignalingChannel.h',
+            'examples/objc/AppRTCDemo/ARDSignalingMessage.h',
+            'examples/objc/AppRTCDemo/ARDSignalingMessage.m',
+            'examples/objc/AppRTCDemo/ARDTURNClient.h',
+            'examples/objc/AppRTCDemo/ARDWebSocketChannel.h',
+            'examples/objc/AppRTCDemo/ARDWebSocketChannel.m',
+            'examples/objc/AppRTCDemo/RTCICECandidate+JSON.h',
+            'examples/objc/AppRTCDemo/RTCICECandidate+JSON.m',
+            'examples/objc/AppRTCDemo/RTCICEServer+JSON.h',
+            'examples/objc/AppRTCDemo/RTCICEServer+JSON.m',
+            'examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.h',
+            'examples/objc/AppRTCDemo/RTCMediaConstraints+JSON.m',
+            'examples/objc/AppRTCDemo/RTCSessionDescription+JSON.h',
+            'examples/objc/AppRTCDemo/RTCSessionDescription+JSON.m',
+          ],
+          'include_dirs': [
+            'examples/objc/AppRTCDemo',
+          ],
+          'direct_dependent_settings': {
+            'include_dirs': [
+              'examples/objc/AppRTCDemo',
+            ],
+          },
+          'export_dependent_settings': [
+            '../talk/libjingle.gyp:libjingle_peerconnection_objc',
+          ],
+          'conditions': [
+            ['OS=="mac"', {
+              'xcode_settings': {
+                'MACOSX_DEPLOYMENT_TARGET' : '10.8',
+              },
+            }],
+          ],
+        },
+        {
+          'target_name': 'AppRTCDemo',
+          'type': 'executable',
+          'product_name': 'AppRTCDemo',
+          'mac_bundle': 1,
+          'dependencies': [
+            'apprtc_common',
+            'apprtc_signaling',
+          ],
+          'conditions': [
+            ['OS=="ios"', {
+              'mac_bundle_resources': [
+                'examples/objc/AppRTCDemo/ios/resources/iPhone5@2x.png',
+                'examples/objc/AppRTCDemo/ios/resources/iPhone6@2x.png',
+                'examples/objc/AppRTCDemo/ios/resources/iPhone6p@3x.png',
+                'examples/objc/AppRTCDemo/ios/resources/Roboto-Regular.ttf',
+                'examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp.png',
+                'examples/objc/AppRTCDemo/ios/resources/ic_call_end_black_24dp@2x.png',
+                'examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp.png',
+                'examples/objc/AppRTCDemo/ios/resources/ic_clear_black_24dp@2x.png',
+                'examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp.png',
+                'examples/objc/AppRTCDemo/ios/resources/ic_switch_video_black_24dp@2x.png',
+                'examples/objc/Icon.png',
+              ],
+              'sources': [
+                'examples/objc/AppRTCDemo/ios/ARDAppDelegate.h',
+                'examples/objc/AppRTCDemo/ios/ARDAppDelegate.m',
+                'examples/objc/AppRTCDemo/ios/ARDMainView.h',
+                'examples/objc/AppRTCDemo/ios/ARDMainView.m',
+                'examples/objc/AppRTCDemo/ios/ARDMainViewController.h',
+                'examples/objc/AppRTCDemo/ios/ARDMainViewController.m',
+                'examples/objc/AppRTCDemo/ios/ARDVideoCallView.h',
+                'examples/objc/AppRTCDemo/ios/ARDVideoCallView.m',
+                'examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.h',
+                'examples/objc/AppRTCDemo/ios/ARDVideoCallViewController.m',
+                'examples/objc/AppRTCDemo/ios/AppRTCDemo-Prefix.pch',
+                'examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.h',
+                'examples/objc/AppRTCDemo/ios/UIImage+ARDUtilities.m',
+                'examples/objc/AppRTCDemo/ios/main.m',
+              ],
+              'xcode_settings': {
+                'INFOPLIST_FILE': 'examples/objc/AppRTCDemo/ios/Info.plist',
+              },
+            }],
+            ['OS=="mac"', {
+              'sources': [
+                'examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.h',
+                'examples/objc/AppRTCDemo/mac/APPRTCAppDelegate.m',
+                'examples/objc/AppRTCDemo/mac/APPRTCViewController.h',
+                'examples/objc/AppRTCDemo/mac/APPRTCViewController.m',
+                'examples/objc/AppRTCDemo/mac/main.m',
+              ],
+              'xcode_settings': {
+                'CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS': 'NO',
+                'INFOPLIST_FILE': 'examples/objc/AppRTCDemo/mac/Info.plist',
+                'MACOSX_DEPLOYMENT_TARGET' : '10.8',
+                'OTHER_LDFLAGS': [
+                  '-framework AVFoundation',
+                ],
+              },
+            }],
+            ['target_arch=="ia32"', {
+              'dependencies': [
+                '<(DEPTH)/testing/iossim/iossim.gyp:iossim#host',
+              ],
+            }],
+          ],
+        },  # target AppRTCDemo
+        {
+          # TODO(tkchin): move this into the real third party location and
+          # have it mirrored on chrome infra.
+          'target_name': 'socketrocket',
+          'type': 'static_library',
+          'sources': [
+            'examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.h',
+            'examples/objc/AppRTCDemo/third_party/SocketRocket/SRWebSocket.m',
+          ],
+          'conditions': [
+            ['OS=="mac"', {
+              'xcode_settings': {
+                # SocketRocket autosynthesizes some properties. Disable the
+                # warning so we can compile successfully.
+                'CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS': 'NO',
+                'MACOSX_DEPLOYMENT_TARGET' : '10.8',
+                # SRWebSocket.m uses code with partial availability.
+                # https://code.google.com/p/webrtc/issues/detail?id=4695
+                'WARNING_CFLAGS!': ['-Wpartial-availability'],
+              },
+            }],
+          ],
+          'direct_dependent_settings': {
+            'include_dirs': [
+              'examples/objc/AppRTCDemo/third_party/SocketRocket',
+            ],
+          },
+          'xcode_settings': {
+            'CLANG_ENABLE_OBJC_ARC': 'YES',
+            'WARNING_CFLAGS': [
+              '-Wno-deprecated-declarations',
+            ],
+          },
+          'link_settings': {
+            'xcode_settings': {
+              'OTHER_LDFLAGS': [
+                '-framework CFNetwork',
+              ],
+            },
+            'libraries': [
+              '$(SDKROOT)/usr/lib/libicucore.dylib',
+            ],
+          }
+        },  # target socketrocket
+      ],  # targets
+    }],  # OS=="ios" or (OS=="mac" and target_arch!="ia32" and mac_sdk>="10.8")
+
+    ['OS=="android"', {
+      'targets': [
+        {
+          'target_name': 'AppRTCDemo',
+          'type': 'none',
+          'dependencies': [
+            '../talk/libjingle.gyp:libjingle_peerconnection_java',
+          ],
+          'variables': {
+            'apk_name': 'AppRTCDemo',
+            'java_in_dir': 'examples/androidapp',
+            'has_java_resources': 1,
+            'resource_dir': 'examples/androidapp/res',
+            'R_package': 'org.appspot.apprtc',
+            'R_package_relpath': 'org/appspot/apprtc',
+            'input_jars_paths': [
+              'examples/androidapp/third_party/autobanh/autobanh.jar',
+            ],
+            'library_dexed_jars_paths': [
+              'examples/androidapp/third_party/autobanh/autobanh.jar',
+            ],
+            'native_lib_target': 'libjingle_peerconnection_so',
+            'add_to_dependents_classpaths': 1,
+          },
+          'includes': [ '../build/java_apk.gypi' ],
+        },  # target AppRTCDemo
+
+        {
+          # AppRTCDemo creates a .jar as a side effect. Any java targets
+          # that need that .jar in their classpath should depend on this target,
+          # AppRTCDemo_apk. Dependents of AppRTCDemo_apk receive its
+          # jar path in the variable 'apk_output_jar_path'.
+          # This target should only be used by targets which instrument
+          # AppRTCDemo_apk.
+          'target_name': 'AppRTCDemo_apk',
+          'type': 'none',
+          'dependencies': [
+            'AppRTCDemo',
+          ],
+          'includes': [ '../build/apk_fake_jar.gypi' ],
+        },  # target AppRTCDemo_apk
+
+        {
+          'target_name': 'AppRTCDemoTest',
+          'type': 'none',
+          'dependencies': [
+            'AppRTCDemo_apk',
+          ],
+          'variables': {
+            'apk_name': 'AppRTCDemoTest',
+            'java_in_dir': 'examples/androidtests',
+            'is_test_apk': 1,
+          },
+          'includes': [ '../build/java_apk.gypi' ],
+        },
+      ],  # targets
+    }],  # OS=="android"
+  ],
+}