Example of an Evernote Android client using Clean Architecture
This client uses the Evernote Sandbox API. To use this application, you have to log in with an Evernote Sandbox account (https://sandbox.evernote.com).
The project is developed using MVP (Model View Presenter) and Clean Architecture.
On the first attempt, I tried to get data using the public Evernote Cloud API, but the official documentation is no longer available (http://dev.evernote.com/documentation/cloud/).
So, in this project I've used the Evernote SDK for Android as the data source (https://github.com/evernote/evernote-sdk-android).
Evernote uses OAuth 1.0 for authentication (https://dev.evernote.com/doc/articles/authentication.php). Therefore, I couldn't build my own view where the user enters their credentials; instead, I had to launch a WebView, wait for the user to enter credentials and grant permission, and then receive the resulting token back via an Android URL scheme.
On the first attempt, I implemented this login flow myself; it is pushed to the cloud-api-login branch. However, since I couldn't use the Evernote Cloud API (its documentation is no longer available), I switched to the authentication flow included in the Evernote SDK for Android.
Notes are obtained with the NoteStore.findNotes method; then, for each note, NoteStore.getNote is called to retrieve the note's content.
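The two-step fetch described above can be sketched as follows. Note that `NoteMeta`, `FullNote`, and the `getNote` function parameter are hypothetical stand-ins for the SDK types and calls, not the actual evernote-sdk-android API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class NoteFetcher {
    // Stand-in for the per-note metadata returned by NoteStore.findNotes.
    public static class NoteMeta {
        public final String guid;
        public NoteMeta(String guid) { this.guid = guid; }
    }

    // Stand-in for a full note as returned by NoteStore.getNote.
    public static class FullNote {
        public final String guid;
        public final String content;
        public FullNote(String guid, String content) {
            this.guid = guid;
            this.content = content;
        }
    }

    // findNotes gives metadata only; content requires one getNote call per note.
    public static List<FullNote> fetchAll(List<NoteMeta> metas,
                                          Function<String, FullNote> getNote) {
        List<FullNote> full = new ArrayList<>();
        for (NoteMeta meta : metas) {
            full.add(getNote.apply(meta.guid));
        }
        return full;
    }
}
```

This pattern implies one network round trip per note, which is why the content call is done lazily per item in the real SDK flow.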
Notes are listed in a RecyclerView using a GridLayoutManager with 2 columns.
The note list view has an options menu that lets the user sort the notes by date or title. Sorting is implemented with the Comparator interface (one comparator per criterion) passed to the Collections.sort method.
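The Comparator-based sort can be sketched like this; the `Note` class below is a minimal stand-in for the app's model, not the actual project code:

```java
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class NoteSorter {
    // Minimal stand-in for the note model: just the fields the comparators need.
    public static class Note {
        public final String title;
        public final long createdMillis;
        public Note(String title, long createdMillis) {
            this.title = title;
            this.createdMillis = createdMillis;
        }
    }

    // Comparator implementations, one per sort criterion.
    public static final Comparator<Note> BY_TITLE =
            (a, b) -> a.title.compareToIgnoreCase(b.title);
    public static final Comparator<Note> BY_DATE =
            (a, b) -> Long.compare(a.createdMillis, b.createdMillis);

    // Sorts the list in place with Collections.sort, as in the list view.
    public static void sort(List<Note> notes, Comparator<Note> comparator) {
        Collections.sort(notes, comparator);
    }
}
```

Keeping each criterion as a named Comparator makes it trivial to wire the options-menu selection to the right sort without conditional logic in the view.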
When a note is tapped in the list view, a new view opens showing more information about the note, such as its author and creation date.
There is an option to add a new note, where the user can enter a title, content, and author. Notes are created with the NoteStore.createNote method.
New notes can also be written by hand.
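Note content passed to NoteStore.createNote must be ENML rather than plain text: the body has to be wrapped in the documented en-note skeleton with the user's text XML-escaped. A minimal helper for this (my own sketch, not the project's code) could look like:

```java
public class Enml {
    // Wraps plain text in the ENML skeleton that Evernote requires
    // as the content of a note.
    public static String wrap(String plainText) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                + "<!DOCTYPE en-note SYSTEM \"http://xml.evernote.com/pub/enml2.dtd\">"
                + "<en-note>" + escape(plainText) + "</en-note>";
    }

    // Escapes XML special characters so user text can't break the markup.
    private static String escape(String text) {
        return text.replace("&", "&amp;")
                   .replace("<", "&lt;")
                   .replace(">", "&gt;");
    }
}
```

Without this wrapping, the server rejects the note with an ENML validation error, so it is worth doing in the data layer rather than in each view.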
As the drawing panel, I developed a custom dialog containing a custom view where the user can draw; when the OK button is pressed, the generated bitmap is sent back to the activity that opened the dialog. The bitmap is then passed to an OCR engine, which extracts the text and writes it into the focused EditText.
I've added Tesseract as the OCR engine (https://code.google.com/p/tesseract-ocr/) with the eng language by default. However, this language fails to recognize most handwritten letters. A handwriting language could be trained with TrainingTesseract3, but that isn't easy and takes time.
Pending improvements:
- Instrumentation tests
- UI tests
- Trained handwriting Tesseract language
- More beautiful UI
- Better error handling
- ENML markup parser
- Logout feature