Friday, 6 November 2015

The Firebase CLI: now with database commands

David East
Developer Advocate

With the latest release, the Firebase CLI can do much more than just hosting: it gives you the power to read and write data from your Firebase database.

These new data commands simplify tasks like seeding, exporting, and even transferring data from one Firebase database to another. Now you can spend less time writing data seeding scripts and more time developing your app.

This article covers a few tricks for common data operation tasks.

Data seeding

To save data from the command line, use the data:set command.


firebase data:set /messages messages.json -f my-firebase-db

The first argument is the path you want to write the data to, and the second is the JSON file to read from. The last argument is the Firebase database to run the operation against. If you use / as the path, it will save to the root and overwrite existing data. This is perfect for seeding your database.


firebase data:set / seed.json -f my-firebase-db

When your database loses its test data or things get out of whack, you can reseed your database with a simple JSON file.

You'll be asked if you really want to overwrite the data in your database. To skip this prompt and move with confidence, you can provide the -y flag.


firebase data:set / seed.json -f my-firebase-db -y

-f is for Firebase

Most developers have a healthy fear of flags labeled -f. But with Firebase, -f is your friend.

The -f flag enables you to specify which Firebase database you’re running the data command against.


firebase data:set /status status.json -f other-firebase-db

This command saves the contents of status.json into the status path of the other-firebase-db Firebase database.

The -f flag opens up a larger set of possibilities, like transferring data to another Firebase database.

Data exports

Reading data from the CLI is also a simple one-liner.


firebase data:get /messages -f my-firebase-db

The data:get command works just like the data:set command. You can also provide a JSON file to store the data locally.


firebase data:get /messages > messages.json -f my-firebase-db

If your Firebase database is under 256 MB, this is a great way to create a seed file to work from.


firebase data:get / > seed.json -f my-firebase-db

Formatting JSON

You'll notice that your JSON comes back unformatted, which isn't the best for human eyes. The data:get command allows you to pipe the result to another command, like Python's json.tool module (which ships with the Python installation included on OS X).


firebase data:get /messages -f my-firebase-db | python -m json.tool

The | is the symbol for piping the output to another source. In this case the json.tool module consumes the piped results from the data:get command.

If you’re looking for something even more readable, try using the npm module prettyjson. This module formats JSON into a colored YAML format that’s easy to read from the command-line.


davideast:
  age: 27
  happy: true
  name: David East
erlichbachman:
  age: 34
  happy: false
  name: Erlich Bachman

The prettyjson module isn't bundled with the CLI so you'll have to install it on your machine. But if you prefer another formatting module, the CLI will pipe the results there too.


npm install -g prettyjson
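
Once it's installed, piping works the same way as with json.tool; for example, to get output like the sample above (the /users path here is just a hypothetical example):

firebase data:get /users -f my-firebase-db | prettyjson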

Data transfers

Transferring data from one Firebase database to another is again a simple one-liner.


firebase data:get / -f my-firebase-db | firebase data:set / -f another-firebase -y

This command is especially helpful if you need to move data to another environment. See our previous blog post for more tips on managing multiple environments with the CLI.

Default Firebase database

If the project directory contains a firebase.json file, the commands will default to the Firebase database in the JSON file. To create a firebase.json file just use the command:


firebase init
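
Answering the prompts creates a firebase.json in the project directory; in the CLI version current at the time of writing it looks roughly like this (the exact contents depend on your answers, and the firebase field names the default database):

{
  "firebase": "my-firebase-db",
  "public": "public"
}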

Now when you run commands within that directory you can omit the -f flag.


firebase data:get /messages messages.json

bash functions

If you find yourself repeating a set of commands, it's probably time to make a bash function. Save your functions to your .bash_profile and you'll be able to access them from anywhere in your console.

If you commonly transfer data between Firebase databases the function below makes it simple.


function transfer_to() {
  local master_db="${1}"
  local dest_db="${2}"
  local path="${3:-/}"
  firebase data:get "$path" -f "$master_db" | firebase data:set "$path" -f "$dest_db" -y
}

To use the function, call transfer_to with the source and destination Firebase databases (and optionally a path).


transfer_to dev-firebase-db staging-firebase-db

Another useful function is for formatting data.


function formatted() {
  local db="${1}"
  local path="${2:-/}"
  firebase data:get "$path" -f "$db" | python -m json.tool
}

The formatted function takes a Firebase database and an optional path (defaulting to the root). The resulting JSON is piped to Python's json.tool.


formatted my-firebase-db /users

You can check out the community project firebase-dot-files on GitHub to contribute your tricks.

The Firebase CLI is more than just hosting. How do you seed your Firebase database? Drop us a comment and let us know if you’re using the new CLI features in your workflow.

Tuesday, 3 November 2015

Twitter Heart Button CSS3 Animation

Today Twitter introduced a new heart (like) button as a replacement for the favorite button. I love the cool animation effect Twitter has implemented for the click action. This post explains how to implement it using CSS3 and jQuery. Read my previous post, Facebook Like System with jQuery, MySQL and PHP, to understand more about the Like System database design and ajax functionality. Take a quick look at the live demo and click the hearts.


Monday, 2 November 2015

Deploying a multi-tier MEAN application on Microsoft Azure

In this post I am going to connect all the dots from my previous three posts.
In brief, I will restructure and reconfigure the application I created in Simple CRUD operation using MEAN stack (#2 in the list at the end of this post), then deploy the application on Microsoft Azure with continuous deployment configured from Github, and finally connect all the different layers using the concepts I covered in NodeJS connectivity to MongoDB on Cloud.
Before starting this example, I would recommend going through those previous posts; this will give you a little background on what I am going to explain here.
Step 1: Restructure the application. In my previous example, Simple CRUD operation using MEAN stack, I provided an application built on the MEAN stack. I am going to take that sample application and split it into two parts: a client using AngularJS as the client side technology, and a server built on NodeJS + Express. Finally, this connects to MongoDB, for which I have used mongolab as a DBaaS (Database as a Service). The overall architecture of the application is as follows.
So my application structure will be as follows.
Client and Server apps: AddressBookClient and AddressBookServer

Alternatively, you can also sign up for a free cloud-based MongoDB database, provided by MongoDB Atlas, using this URL: https://www.mongodb.com/cloud/atlas

Step 2: Configure the Client and Server applications on Github. I have provided more details on application configuration on Github in Continuous deployment of Node, Express application on Azure using Github. I am following similar steps here, except that instead of one consolidated application I am deploying two applications, i.e. one for the server and another for the client.
Step 3: Configure the Client application on Microsoft Azure having Github as source control with continuous deployment
Step 4: Configure the Server application on Microsoft Azure having source control on Github with continuous deployment
Step 5: Create a new app settings key in the Microsoft Azure console, following these steps:
WEB APPS -> Your Application -> Configure -> App Settings -> Add a new Key and Value as mentioned below.
Step 6: Configure your nodejs application with this key. Using this key my nodejs server application will pull the DB connection string from the environment variable configured on Azure.
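
For example, if the key created in Step 5 were named MONGOLAB_URI (the name is up to you), the server could read it like this; a minimal sketch with a local fallback for development:

var mongojs = require('mongojs');

// MONGOLAB_URI is an example name; use whatever key you created in Step 5.
// Azure exposes app settings to the Node process as environment variables.
var connectionString = process.env.MONGOLAB_URI || 'localhost/AddressBook';
var db = mongojs(connectionString, ['Persons']);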

Step 7: Once your application is deployed on Azure, you can test your API by hitting the URL of the NodeJS Express API; in my case it is: addressbookserver.azurewebsites.net/persons
This gave me the JSON response from the mongolab data store. If you don't have any sample data on mongolab, you can log in to your mongolab console and add a few records with a schema similar to the one shown above. This is required only for testing purposes. From here onwards my client application is going to GET/PUT/POST and DELETE this data.
Step 8: So up to this point the DB server, Express API and Angular client are all deployed on Azure. Ideally, when I hit the client URL it should call the Express API and, using the mongodb driver, the Express server should fetch the data from MongoDB hosted on the mongolab server. Let's go ahead and test the application end to end. Even though I have data in MongoDB, I am not getting any output on my client; if you open the browser console you can see I am getting an error from my Express server.

XMLHttpRequest cannot load http://addressbookserver.azurewebsites.net/persons. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://addressbookclient.azurewebsites.net' is therefore not allowed access.
This is due to a security feature enforced by the browser: by default, JavaScript is prevented from making requests across domain boundaries, which has spawned various hacks for making cross-domain requests. CORS (Cross-Origin Resource Sharing) introduces a standard mechanism that can be used by all browsers for implementing cross-domain requests. The spec defines a set of headers that allow the browser and server to communicate about which requests are (and are not) allowed.
Since in this example I have broken my application down into a client and a server and hosted them on two different domains, this extra layer of validation is triggered. CORS is a vast topic in itself, so instead of diverting from the actual topic I would recommend reading the dedicated site for CORS; I believe it will give you a very good understanding of the internals of CORS.
So coming back to the original article, my next step is to configure my application for CORS.
Step 9: CORS (Cross-Origin Resource Sharing)
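
A minimal version of this middleware for Express, following the pattern shown on enable-cors.org, might look like this in server.js:

// Allow the Angular client, hosted on a different domain, to call this API.
app.use(function(req, res, next) {
  res.header('Access-Control-Allow-Origin', 'http://addressbookclient.azurewebsites.net');
  res.header('Access-Control-Allow-Methods', 'GET, PUT, POST, DELETE');
  res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept');
  next();
});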

In the code above I am adding my client URL to the Access-Control-Allow-Origin header in the server.js file of the Express API. With this, every response served by the Express API tells the browser which origin is allowed, so the browser will make sure that no page from a domain other than my client domain, http://addressbookclient.azurewebsites.net, is able to consume my API. You can refer to the expressjs implementation of CORS @ http://enable-cors.org/server_expressjs.html
Step 10: Now that we have everything in place, let's try refreshing the client URL: http://addressbookclient.azurewebsites.net
So now I am able to get the data from mongodb through the Express API running on NodeJS.
And if I inspect the response, I can see that Access-Control-Allow-Origin is set to my client domain, which means that if any other origin tries to access this API it will get an error similar to the one I received in Step 8 above. If you don't want any constraints or restrictions, you can change this to Access-Control-Allow-Origin: '*'; the wildcard asterisk allows any domain to access my API.
And this is pretty much all I had to share on this topic, but to get the complete picture of all the associated topics, don't forget to visit my previous posts:
  1. Continuous deployment of Node, Express application on Azure using Github
  2. Simple CRUD operation using MEAN stack
  3. NodeJS connectivity to MongoDB on Cloud - Microsoft Azure and mongolab
You can fork or clone the code @

Monday, 19 October 2015

NodeJS connectivity to MongoDB on Cloud - Microsoft Azure and mongolab

In this post I am going to cover the different options for hosting your MongoDB database: DBaaS (Database as a Service), PaaS (Platform as a Service) and IaaS (Infrastructure as a Service), using Microsoft Azure.

Option 1 : Using Infrastructure as a Service (IaaS)

In the first option I am using Microsoft Azure dedicated VM to configure MongoDB. This means that you can have your database in your favorite cloud in the same location as your application tier.

Step 1: Configure your MongoDB database on Windows Server 2008 on a VM hosted in the cloud (Microsoft Azure). I found an excellent article which demonstrates how to configure mongodb on Windows Server 2008 hosted on Azure; reiterating the same thing here does not make any sense, so I am providing the link to the article. You can use it to configure your mongodb database server.

https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-install-mongodb-windows-server/

Option 2: Using Platform as a Service (PaaS)

For this option I am using mongolab via the Microsoft Azure console; alternatively, you can go directly to https://mongolab.com and create your free or paid account.

I am starting from Step 0, just so that my Steps 1 through 3 match the steps provided by the Microsoft Azure wizard and to avoid any confusion.

Step 0: Once you are logged in to the Microsoft Azure console, select Marketplace from the list of available options.

Step 1: Select MongoLab as a developer service and click the Next arrow

Step 2: Select your plan and click the Next arrow

Step 3: Click on Purchase

Step 4: Go to your MongoLab dashboard to manage your MongoLab service; this link will take you to https://mongolab.com/

Step 5: Once you are in the mongolab Dashboard, you have full control of your mongodb databases, collections, documents, users, profile, etc.

Test connectivity using command prompt in Windows OS

Step 1: Before you connect to the mongolab database, you need to create users for it; you have the option to create a read-only user or users with full access.

Step 2: To test connectivity, open a command prompt and enter the following command

$ mongo ds048368.mongolab.com:48368/MongoLab-3 -u <dbuser> -p <dbpassword>

This will take you to the mongodb console, where you can use mongodb commands/queries to access the documents and collections.

The same will be reflected in the web console, where you also have the option to directly edit your databases, collections, users and documents.

Connecting to mongolab using nodejs

Step 1: Install the mongodb driver for node using the command

$ npm install mongodb

Step 2: Create a server.js file and add the following code.
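
A minimal sketch of that file, using the official mongodb driver and the connection string from the shell example above (substitute your own <dbuser> and <dbpassword>):

// server.js - test connectivity to the mongolab-hosted database.
var MongoClient = require('mongodb').MongoClient;

var url = 'mongodb://<dbuser>:<dbpassword>@ds048368.mongolab.com:48368/MongoLab-3';

MongoClient.connect(url, function(err, db) {
  if (err) {
    console.log('Connection failed: ' + err);
    return;
  }
  console.log('Connected to mongolab');
  db.close();
});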

Step 3: Run the application to test connectivity to mongolab.

Now you are all set to start your development using MongoDB hosted on a cloud platform. Since I am using these technologies for learning purposes, I personally found mongolab the most convenient option, as it doesn't require a lot of configuration before you actually start development and you don't have to run extra commands to make sure your server is running on the console.

Even though mongolab and the Windows Azure VM both host mongodb, the two options have entirely different infrastructure and concepts. For learning purposes you may choose either option, but if you are seriously thinking of hosting mongodb for your enterprise applications then you should do a detailed study of which one suits you better in terms of IaaS or PaaS.

Mongolab also has partnerships with cloud providers to offer both Infrastructure as a Service and Platform as a Service. You can learn more @ https://mongolab.com/company/partners/

References and other helpful links

  1. https://docs.mongodb.org/manual/
  2. https://mongodb.github.io/node-mongodb-native/api-articles/nodekoarticle1.html

Saturday, 17 October 2015

Simple CRUD operation using MEAN stack

In this post I am going to cover very basic CRUD operations using the MEAN (MongoDB, Express, Angular and NodeJS) stack. You can use this example to kick-start your project using the MEAN stack.

Before I begin, I am assuming that the reader of this post has a basic understanding of AngularJS, MongoDB, NodeJS and other client side technologies like jQuery, HTML5, CSS3, etc.

Most of the commands and the environment I have used are for the Windows operating system, but there's not much difference if you are using MacOS or Linux; moreover, all the technologies used in this example are platform independent.

1. Setup the environment

Follow the URLs on their respective sites to get detailed information on downloading and installation. You can get the installers for every OS supported by these frameworks, along with their installation instructions, from those links.

2. Setup Solution

Solution Structure: In this example I am using a person database, and the only function of this application is to Create-Read-Update-Delete a person record. To achieve this I have created a simple solution structure, as follows.

Here I am using the app folder for most of my application code; this folder contains the angular controllers, services and views. I am keeping my node modules very simple, just to expose APIs which will be consumed by the AngularJS services, and I have covered most of the server side node code in just one file called server.js. I will try to cover more complex node module architecture in my next post.

3. Downloading, Installation and configuration of Dependencies

Download and installation instructions: For this example my application is using the following node modules; to install them I am using the node package manager from GitBash.

  • body-parser: a node.js body-parsing middleware for simple applications; for more detail please refer to https://github.com/expressjs/body-parser
    • Installation: $ npm install body-parser
  • express: Express is a minimal and flexible Node.js web application framework that provides a robust set of features for web and mobile applications (Reference: http://expressjs.com/). I am using express in my middleware to create node.js server and API’s
    • Installation: $ npm install express
  • mongojs : A node.js module for mongodb, that emulates the official mongodb API as much as possible. It wraps mongodb-core and is available through npm (Reference: https://github.com/mafintosh/mongojs).
    • Installation: $ npm install mongojs
  • angular-loading-bar: I am using this on the client side to display a progress bar. The interesting thing is that, due to the fast performance of my application, I never get to see it in action.
    • Installation: $ npm install angular-loading-bar
Configuration of the AngularJS client application is covered in the Configure Angular JS Client section below.

4. Setup Middleware

I am using node.js for my middleware; to set up the middleware, I first created my server in node.js.
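
Stripped down to its essentials, that server.js is just three lines; a sketch (the next paragraph walks through them):

var express = require('express');  // line 1: import the express module
var app = express();               // line 2: create an instance of express
app.listen(3000);                  // line 3: listen for requests on port 3000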

Here in lines 1 and 2 above I have imported the express module and created an instance of express, and in line 3 I have used the listen function of express to create a new server on port 3000. With this, my server is listening on port 3000, and when I run it in GitBash I can see it start up.

I still need to add a few more modules to my server.js to help the application connect to my mongodb server and configure the application's working directories. But before I do that, let me first start creating the GET/POST/PUT and DELETE APIs in server.js which will communicate with my client module in AngularJS.

5. Create GET/POST/PUT and DELETE server APIs
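
A sketch of those stubs in server.js, matching the endpoints in the table below, could look like this (each one just logs to the console and returns an empty result for now):

app.get('/persons', function(req, res) {
  console.log('GET all persons');
  res.json([]);
});

app.get('/person/:id', function(req, res) {
  console.log('GET person ' + req.params.id);
  res.json({});
});

app.post('/addPerson', function(req, res) {
  console.log('POST add person');
  res.json({});
});

app.delete('/deletePerson/:id', function(req, res) {
  console.log('DELETE person ' + req.params.id);
  res.json({});
});

app.put('/updatePerson', function(req, res) {
  console.log('PUT update person');
  res.json({});
});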

This blueprint of my middleware in server.js exposes the following endpoints:

Node JS API          Description
/persons             Get all persons
/person/:id          Get a single person by person Id
/addPerson           Add a new person
/deletePerson/:id    Delete a single person by id
/updatePerson        Update a single person

In the first few lines of my code I have created the instance of express and configured my application to use the application's root directory as the working folder. Now let's run the server and check the output in a web browser.

In this example I have created stubs for the Get by Id, Get All, Update, and Delete person requests. To test the code, start the node server; initially you will see the same output I showed earlier, but when you enter the URLs on which the APIs are created in the browser, you can see the output in the console.

I have provided examples of the GET requests below; I will write the Angular modules to test the POST/PUT and DELETE APIs.

GET All (localhost/persons)

GET By ID (localhost/person/1)

6. Configure Angular JS Client

I am using a simple MVC model in the AngularJS client, but you can fork or clone my Git repository to extend this solution. This example assumes that you have a basic understanding of Angular JS. To start with, let's create a simple router which redirects my application to the home page, the person page.

Angular JS Routing
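
A sketch of that routing configuration using ngRoute (the module name personApp is just a placeholder):

// app.js - client-side routes; personApp is a placeholder module name.
var app = angular.module('personApp', ['ngRoute']);

app.config(['$routeProvider', function($routeProvider) {
  $routeProvider
    .when('/person', {
      templateUrl: 'app/views/person.html',
      controller: 'PersonCtrl'
    })
    .when('/person/:PersonId', {
      templateUrl: 'app/views/persondetail.html',
      controller: 'PersonAddressCtrl'
    })
    .otherwise({ redirectTo: '/person' });
}]);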

Here in the example above, I have mapped the paths /person and /person/:PersonId to the PersonCtrl and PersonAddressCtrl controllers respectively, with the corresponding views app/views/person.html and app/views/persondetail.html. This means that when I hit the URL http://localhost:3000/person, the routing engine takes me to person.html mapped to the PersonCtrl.js controller, and when I hit a single person out of the list using a personId, e.g. http://localhost:3000/person/1, it redirects the user to persondetail.html using PersonAddressCtrl.js.

Angular Controllers – PersonCtrl

onError : This is my common error handler; this action method is used only to display an error when something goes wrong during the communication process, either at the server level or at the client level.

refresh : This is my default get-all-persons action method, used to load all the person data from the mongodb database via nodejs. In this code I am calling the corresponding API '/persons', created in nodejs and express, to populate all the persons from mongodb. Once the data is returned from mongodb, a callback function, onPersonGetCompleted, populates the person data in the scope of the current module. refresh() is called as soon as person.html, my default page, is loaded, and every time we need to refresh the data from mongodb.

searchPerson : This action method is called when I want to view a single person based on person id, for example when opening the update form for the selected person. This method is associated with the callback function onGetByIdCompleted, which populates the scope with a single person result from mongodb.

addPerson : As the name suggests, this action method is used to add a new person; the associated callback function, onAddPersonCompleted, refreshes the data once the add is completed so the Person default page reflects up-to-date information.

deletePerson : As the name suggests, the only purpose of this action method is to delete the selected person. It calls the '/deletePerson/:id' API of nodejs and, once the call returns, uses its callback function onPersonDeleteCompleted to refresh the data in the HTML page person.html.

updatePerson : This action updates the person information in mongodb. It calls the '/updatePerson' API in nodejs. Once the update completes, the callback function onUpdatePersonCompleted in Angular refreshes the updated data in the Person page.

Let's put everything together to see my completed PersonCtrl.js. Next I am going to show how this integrates with the view, and the final step will be to write the body of my API controller, which will be used to transact with mongodb.
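
A condensed sketch of that controller is below; it uses $http against the API routes listed earlier, and for brevity the add/update/delete success callbacks simply reuse refresh (these correspond to the onAddPersonCompleted, onUpdatePersonCompleted and onPersonDeleteCompleted callbacks described above):

// PersonCtrl.js - condensed sketch; personApp is the placeholder module name used above.
angular.module('personApp').controller('PersonCtrl', ['$scope', '$http',
  function($scope, $http) {

    var onError = function(reason) {
      $scope.error = 'Something went wrong: ' + reason;
    };

    var onPersonGetCompleted = function(response) {
      $scope.persons = response.data;  // bind the full person list to the view
    };

    var onGetByIdCompleted = function(response) {
      $scope.person = response.data;   // bind a single person, e.g. for the edit modal
    };

    $scope.refresh = function() {
      $http.get('/persons').then(onPersonGetCompleted, onError);
    };

    $scope.searchPerson = function(personId) {
      $http.get('/person/' + personId).then(onGetByIdCompleted, onError);
    };

    $scope.addPerson = function(person) {
      $http.post('/addPerson', person).then($scope.refresh, onError);
    };

    $scope.updatePerson = function(person) {
      $http.put('/updatePerson', person).then($scope.refresh, onError);
    };

    $scope.deletePerson = function(personId) {
      $http.delete('/deletePerson/' + personId).then($scope.refresh, onError);
    };

    // Load all persons as soon as person.html is displayed.
    $scope.refresh();
  }
]);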

7. Completing the NodeJS Express API Controller

Now let's revisit the API controller, the skeleton of which I created in Step 5 above. Here I am going to write the bodies of my GET/POST/PUT and DELETE methods, which will be used to transact with the mongodb document database. In the chart below I have shown each NodeJS Express API and the corresponding mongodb API that will be called to complete the transaction.

Action NodeJS Express API MongoDB API
Get All Persons /persons db.Persons.find()
Get Person by Id /person/:id db.Persons.findOne()
Add new Person /addPerson db.Persons.insert()
Delete Person by Id /deletePerson/:id db.Persons.remove()
Update single Person /updatePerson db.Persons.findAndModify()

If you want to extend your knowledge beyond the list provided above, I would suggest looking into the very detailed and excellent documentation provided by MongoDB: http://docs.mongodb.org/manual/applications/crud/

Now let's look at the completed nodejs API. Here in line #4, AddressBook is my collection name, which is the equivalent of a table in an RDBMS, and Person is the document. If you are still confused by documents, collections and the other MongoDB jargon, take a look at http://docs.mongodb.org/manual/reference/glossary/, which covers the mongodb glossary.

I have also introduced body-parser here, which is used to parse the request and response in json format.
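
Putting those pieces together, a trimmed-down sketch of the completed server.js might look like the following; the collection name Persons matches the db.Persons calls in the chart above (the post refers to this person data as the AddressBook collection), so adjust the names to whatever you created on your own MongoDB server:

var express = require('express');
var bodyParser = require('body-parser');
var mongojs = require('mongojs');
var db = mongojs('AddressBook', ['Persons']);  // line #4: connect to the AddressBook data; Persons holds the person documents

var app = express();
app.use(express.static(__dirname));   // serve the Angular client files from the application root
app.use(bodyParser.json());           // parse JSON request bodies

app.get('/persons', function(req, res) {
  db.Persons.find(function(err, persons) {
    res.json(persons);
  });
});

app.get('/person/:id', function(req, res) {
  db.Persons.findOne({ _id: mongojs.ObjectId(req.params.id) }, function(err, person) {
    res.json(person);
  });
});

app.post('/addPerson', function(req, res) {
  db.Persons.insert(req.body, function(err, person) {
    res.json(person);
  });
});

app.delete('/deletePerson/:id', function(req, res) {
  db.Persons.remove({ _id: mongojs.ObjectId(req.params.id) }, function(err, result) {
    res.json(result);
  });
});

app.put('/updatePerson', function(req, res) {
  var id = req.body._id;
  delete req.body._id;  // _id itself cannot be modified
  db.Persons.findAndModify({
    query: { _id: mongojs.ObjectId(id) },
    update: { $set: req.body },
    new: true
  }, function(err, person) {
    res.json(person);
  });
});

app.listen(3000);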

And again, for an extensive explanation of each of the operations used in this example, you can refer to the MongoDB documentation: http://docs.mongodb.org/manual/applications/crud/

This is the least code you can write to implement CRUD operations using mongodb and node, but the possibilities are limitless if you want to extend this example. Some of the things you can do with very little effort to make this solution more structured and object oriented are:

1. Use the application generator tool, express, to quickly create an application skeleton. http://expressjs.com/starter/generator.html

2. Use ORM/ODM tools like mongoose. Mongoose provides a straight-forward, schema-based solution to model your application data. It includes built-in type casting, validation, query building, business logic hooks and more, out of the box.

3. Use routing on the server side. Though I have used routing on the client side, you can also use routing in the middleware to keep more control of your code. You can learn more on these topics @ http://expressjs.com/starter/basic-routing.html OR http://expressjs.com/guide/routing.html

These are just a few of the options that can be done with very little effort, but the possibilities are unlimited. You can fork or clone my example and extend it as far as you like for learning purposes.

Now let's get back to the pending item which will wind up this post: creating the views.

8. Angular JS and HTML Views

I am going to create a dashboard which displays all the persons I have in my mongodb database in the AddressBook collection. I have used AngularJS bindings for model binding, and CSS3 and HTML5 to make the page look a little decent. On click of Edit I am going to edit the person using a modal popup with pre-populated person data; similarly, I am going to have a modal popup for Add Person with empty controls which allows the user to add a new person record, and Delete simply removes the record from mongodb. My example is designed to take you to the persondetail.html page, which will have more details of the person, like the person's address, but for simplicity's sake I have not implemented that in this example.

Now to achieve this I will need only a few pages, which are listed below.

index.html : This page only has the references to the controllers, angularjs, bootstrap and other client side libraries. This page also has a

<div data-ng-view="data-ng-view"></div>

This acts as a placeholder for displaying the views injected by the angularjs routing engine.

person.html

This is the home page, or default page, of my application. The purpose of this page is to list all the persons in the mongodb database and to add, edit and delete persons from mongodb.

I have used a very loosely defined model called person, and it is saved in mongodb directly without any further transformation. My view is bound to the persons model in the $scope through the PersonCtrl. Once the model is populated, I use ng-repeat to display all the persons in the table. In line #17, I call searchPerson with the personId in PersonCtrl to get the selected person to edit from mongodb; data-toggle and data-target are used to tell the Edit button to display #personEditModal when it is clicked. For delete, in line #18 I call the deletePerson action in PersonCtrl, which in turn calls the NodeJS delete API and removes the person from the mongodb database.
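
An excerpt of what those table rows might look like (a sketch only; the person fields shown are illustrative, and lines #17 and #18 referenced above correspond to the Edit and Delete buttons):

<!-- person.html (excerpt) - sketch; field names are illustrative -->
<tr data-ng-repeat="person in persons">
  <td>{{person.name}}</td>
  <td>
    <!-- roughly line #17: fetch the selected person and open the edit modal -->
    <button data-ng-click="searchPerson(person._id)"
            data-toggle="modal" data-target="#personEditModal">Edit</button>
    <!-- roughly line #18: delete the selected person via the nodejs API -->
    <button data-ng-click="deletePerson(person._id)">Delete</button>
  </td>
</tr>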

For the Add and Edit modal popups I have a very basic design. I have provided the example of editing a person; the only differences between edit and add are the captions/headings and what happens on click of the Save button: Edit calls the updatePerson(person) action in PersonCtrl, while the Add Person modal calls the addPerson(person) action in PersonCtrl.

So with this code I have completed the end-to-end data bindings, the client side API using Angular, and the server side API using nodejs and express. You might have noticed that I have not done anything like schema creation in mongodb, as mongodb is a schema-less, document-oriented database, so the same JSON I use to communicate between the different layers is stored directly in the mongodb database. But again, this is just the start; the things I have not covered in mongodb include relational data, for example person and person-address documents and how they can be associated with each other using references, complex model bindings, etc. You can clone, fork or download this example from my git repository @ https://github.com/bmdayal/MEANSample


Thursday, 15 October 2015

Best practices for the iOS UIViewController and Firebase


David East
Developer Advocate

The UIViewController comes with a lifecycle that informs us when important events occur: events such as viewDidLoad, viewWillAppear, viewDidDisappear, and the always fun "Stop! You're using too much memory!" warning.
The UIViewController's lifecycle is a huge convenience. But it can be confusing to know when to do what. This article will cover a few best practices to help you develop with confidence.
Since we're developers, we'll use a zero-based index for this list.

0. Initialize references in viewDidLoad

override func viewDidLoad() {
    super.viewDidLoad()
    ref = Firebase(url: "https://<YOUR-FIREBASE-APP>.firebaseio.com")
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.ref = [[Firebase alloc] initWithUrl:@"https://<YOUR-FIREBASE-APP>.firebaseio.com"];
}
You can't count on a UIViewController's initializer. This is because controllers that come from the Storyboard don't have their initializer called. This leaves us with the viewDidLoad method.
The viewDidLoad method is usually the first method we care about in the UIViewController lifecycle. Since viewDidLoad is called once and only once in the lifecycle, it's a great place for initialization.
The viewDidLoad method is also called whether you use Storyboards or not. Outlets and properties are set at this point as well. This will enable you to do any dynamic creation of a reference's location.

1. Initialize references with implicitly unwrapped optionals (Swift-only)

class ViewController : UIViewController {
    var ref: Firebase!

    override func viewDidLoad() {
        super.viewDidLoad()
        ref = Firebase(url: "https://<YOUR-FIREBASE-APP>.firebaseio.com")
    }
}
In Swift all properties' values must be set before initialization is complete. And that's a big problem. You can't rely on a UIViewController's initializer to ever be called. So how do you set the value for the Firebase reference property? Use an implicitly unwrapped optional.
By unwrapping the property the compiler will assume the value will exist by the time it's called. If the value is nil when called, it'll crash the app. That won't happen for a reference property if the value is set in viewDidLoad.
Using an implicitly unwrapped optional is you telling the compiler: "Chill-out, I know what I'm doing."
You might be wondering why you shouldn't inline the value.
class ViewController : UIViewController {
    let ref = Firebase(url: "https://<YOUR-FIREBASE-APP>.firebaseio.com")
}
There's no problem using an inline value. You're just limited to static values since you can't use other properties or variables.
class ViewController : UIViewController {
    // This won't compile :(
    let ref = Firebase(url: "https://my.firebaseio.com/\(myCoolProperty)")
}

2. Create listeners in viewWillAppear, not in viewDidLoad

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    ref.observeEventType(.Value) { (snap: FDataSnapshot!) in print(snap.value) }
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self.ref observeEventType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
        NSLog(@"%@", snapshot.value);
    }];
}
Your app should be a good citizen of battery life and memory. To preserve battery life and memory usage, you should only synchronize data when the view is visible.
The viewWillAppear method is called each time the view becomes visible. This means if you set your listener here, your data will always be in sync when the view is visible.
You should avoid creating listeners in viewDidLoad. Remember that viewDidLoad only gets called once. When the view disappears you should remove the listener. This means the data won't re-sync when the view becomes visible again.

3. Remove listeners in viewDidDisappear with a FirebaseHandle

class ViewController : UIViewController {
    var ref: Firebase!
    var handle: UInt!

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        handle = ref.observeEventType(.Value) { (snap: FDataSnapshot) in print(snap.value) }
    }
}

@interface ViewController()
@property (nonatomic, strong) Firebase *ref;
@property FirebaseHandle handle;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.ref = [[Firebase alloc] initWithUrl:@"https://<YOUR-FIREBASE-APP>.firebaseio.com"];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    self.handle = [self.ref observeEventType:FEventTypeValue withBlock:^(FDataSnapshot *snapshot) {
        NSLog(@"%@", snapshot.value);
    }];
}

@end
To remove a listener in iOS you need a FirebaseHandle. A FirebaseHandle is just a typealias for a UInt that keeps track of a Firebase listener.
Note that in Swift you need to use an implicitly unwrapped optional since the value can't be set in the initializer. The handle's value is set from the return value of the listener.
Use this handle to remove the listener in viewDidDisappear.
override func viewDidDisappear(animated: Bool) {
    super.viewDidDisappear(animated)
    ref.removeObserverWithHandle(handle)
}

-(void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [self.ref removeObserverWithHandle:self.handle];
}
If your controller is still syncing data when the view has disappeared, you are wasting bandwidth and memory.

Leaky Listeners

A leaky listener is a listener that is consuming memory to store data that isn't displayed or accessed. This is especially an issue when navigating using a UINavigationController, since the root controller isn’t removed from memory when navigating to a detail controller. This means a root controller will continue to synchronize data if the listener isn't removed when navigating away. This action takes up needless bandwidth and memory.


The thought of removing the listener might sound unappealing. You may think you need to keep your listener open to avoid downloading the same data again, but this is unnecessary.
Firebase SDKs come baked in with caching and offline data persistence. These features keep the client from having to fetch recently downloaded data.


4. Enable offline in the AppDelegate's initializer

class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    override init() {
        Firebase.defaultConfig().persistenceEnabled = true
    }
}

@implementation AppDelegate

- (instancetype)init
{
    self = [super init];
    if (self) {
        [[Firebase defaultConfig] setPersistenceEnabled:YES];
    }
    return self;
}

@end

Speaking of offline, this tip isn't UIViewController specific, but it's important. Offline has to be set before any other piece of Firebase code runs.
Your first instinct might be to enable offline in the AppDelegate's application:didFinishLaunchingWithOptions method. This will work for most situations, but not for all. This can go wrong when you inline a Firebase property's value in the root UIViewController. The value of the Firebase property is set before application:didFinishLaunchingWithOptions gets called, which will cause the SDK to throw an exception.
By setting up the offline config in the AppDelegate init we can avoid this issue.

Final example


Check out this gist to see the final version of the UIViewController.


Takeaways

If you remember anything remember these things:
  • Initialize references in viewDidLoad
  • Synchronize data only when the view is visible
  • Store a handle to simplify removing a reference
  • Remove listeners when the view is not visible
  • Configure offline persistence in the AppDelegate's initializer
How do you manage Firebase in your UIViewController? Let us know if you're using any of these today or if you have any best practices of your own.

Tuesday, 13 October 2015

Divshot has Joined Firebase!

Michael Bleigh
Engineer

Today we're excited to announce that front-end web hosting service Divshot has joined Firebase!

Both teams share a passion for creating fantastic developer experiences, and now we're moving forward together toward that common goal. Divshot and Firebase have teamed up before: Firebase sponsored Divshot's Static Showdown hackathon and was used by more than 50 percent of the developers who participated.

As a cofounder of Divshot, I'm excited to bring the best parts of Divshot's technology to Firebase Hosting. We're launching a brand new Firebase command-line interface today with a local static web server powered by Divshot's open-source Superstatic library. Now you can develop locally with all of your rewrites, redirects, and other Firebase Hosting options. There's more on the way, so stay tuned!

Moving forward the team will be fully focused on making Firebase Hosting a fantastic developer experience, so Divshot will shut down existing products and services on Monday, December 14, 2015. For existing Divshot customers, we have a detailed migration guide to help you easily transition to Firebase Hosting. The transition should be quick and painless (especially because many of you are already Firebase developers!).

I want to thank every Divshot user for their support over the past three years. I’ve been blown away by your creativity and community spirit, and I can't wait to see what you all build next.

Happy coding!
