Android Interview Questions

1. Main components of Android 
2. What are Activity life cycles? Explain with an example. 
3. Explain remote service. 
4. Explain pending intent. 
5. What is the advantage of a pending intent over an intent? 
6. Explain Broadcast Receivers with an Example 
7. Explain content provider and content resolver. 
8. Explain content provider and content resolver with help of a sample code. 
9. What is .apk? How is it generated? 
10. What is .aidl file? Can we use any different name instead of .aidl and create interface using any commands? 
11. What are threads? 
12. What are the two ways of implementing threads and which is the better one? 
13. sendBroadcast(), sendOrderedBroadcast() and sendStickyBroadcast() 
14. What are the different states of a thread? 
15. Explain Widgets in Android? 
16. Explain the Android runtime environment. 
17. What is a Launcher? 
18. Explain init.c with respect to Dalvik Virtual Machine? 
19. Explain Services. What are the different types of services? 
20. Explain aidl interface and its implementation? 
21. Explain remote service with examples? 
22. Explain intent filter and intents? 
23. If an activity ‘A’ starts activity ‘B’ and ‘B’ starts another activity ‘C’, which method should be called to finish all activities? 
24. Explain startService() and bindService()? 
25. What is ContentResolver? Is it a wrapper class? 
26. What is the difference between a content resolver and a content provider? 
27. Explain implicit and explicit intents and their flow? 
28. Explain Activity Lifecycle 
29. Explain Service Lifecycle? 
30. Local service and remote service? 
31. Explain AIDL concept? 
32. What are Broadcast Receivers and how to register them statically and dynamically? 
33. What are Content providers and how do we access the data? 
34. What adb commands have you used? 
35. What is manifest file? 
36. What are category, action and data in intents, and how do they work? 
37. Shared preference - how to get it? 
38. Android listview and adapters 
39. How does Android manage applications on low memory? 
40. Pass data across different applications 
41. How to start an Activity of another application 
42. How to use Bundles and pass parameters? 
43. What all is declared in AndroidManifest.xml? 
44. How to make an Activity launch at start? 
45. Can we have two launcher Activities? 
46. How to make onCreate execute multiple times? 
47. Explain Receivers? 
48. How to get the intent in a newly launched Activity 
49. How to troubleshoot when adb is not detecting a device 
50. List some adb commands 
51. If I want to save the state of an Activity, in which lifecycle method should I store it? 
52. When will the onSaveInstanceState() method be called? 
53. Difference between service and broadcast-receiver? 
54. What is Pending Intent
55. Difference between Activity and view
56. What is a SharedPreference and how does it differ from a normal preference
57. What are the killable methods in Activity life cycle?
58. Can an Activity run in the background? 
59. What is a layout and what are the different types of layouts? 
60. What is the difference between linear and relative layout? 
61. Can a linear layout be nested in a relative layout and vice versa? 
62. How to position the layout? 
63. Difference between SimpleCursorAdapter and CursorAdapter 
64. What are Handlers and how are they implemented? 
65. How do the library and framework layers communicate? 
66. What is JNI? How does it work? 
67. Difference between CursorAdapter and SimpleCursorAdapter. 
68. How do you debug your application?

Audio Module Changes in Android Versions

Android Jelly Bean (4.2):

Low-latency audio:

Android 4.2 improves support for low-latency audio playback, building on the improvements made in the Android 4.1 release for audio output latency using the OpenSL ES, SoundPool and tone generator APIs. These improvements depend on hardware support: devices that offer these low-latency audio features can advertise their support to apps through a hardware feature constant. New AudioManager APIs are provided to query the native audio sample rate and buffer size, for use on devices which claim this feature.
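As a minimal sketch of how an app might check for this support and query the native values on a 4.2+ device (the class name and log tag here are illustrative):

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.media.AudioManager;
    import android.util.Log;

    public class LowLatencyAudioCheck {
        public static void logLowLatencySupport(Context context) {
            // Hardware feature constant advertised by devices with low-latency audio.
            boolean lowLatency = context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);

            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            // New in Android 4.2 (API 17): query the native output sample rate and buffer size.
            String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
            String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);

            Log.d("AudioInfo", "lowLatency=" + lowLatency
                    + " sampleRate=" + sampleRate
                    + " framesPerBuffer=" + framesPerBuffer);
        }
    }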

Android Jelly Bean (4.1):

Media codecs: The MediaCodec class provides access to low-level media codecs for encoding and decoding your media. You can instantiate a MediaCodec by calling createEncoderByType() to encode media or calling createDecoderByType() to decode media. Each of these methods takes a MIME type for the kind of media you want to encode or decode, such as "video/3gpp" or "audio/vorbis".

With an instance of MediaCodec created, you can then call configure() to specify properties such as the media format or whether or not the content is encrypted.
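A rough sketch of that flow is shown below: it creates and configures an audio decoder. The MIME type and format values are only examples; in a real app the MediaFormat, including any codec-specific data, would usually be obtained from a MediaExtractor before starting the codec.

    import android.media.MediaCodec;
    import android.media.MediaFormat;

    public class CodecSetup {
        // Sketch only: create and configure a decoder for a compressed audio stream.
        public static MediaCodec createAudioDecoder() throws java.io.IOException {
            // Pick a codec by MIME type ("audio/vorbis" is just an example).
            MediaCodec decoder = MediaCodec.createDecoderByType("audio/vorbis");

            // Describe the stream: 44.1 kHz stereo (illustrative values). In practice this
            // format, with codec-specific data, comes from a MediaExtractor.
            MediaFormat format = MediaFormat.createAudioFormat("audio/vorbis", 44100, 2);

            // No output Surface or MediaCrypto for plain, unencrypted audio; flags = 0 for decoding.
            decoder.configure(format, null, null, 0);
            return decoder;   // call start() once the format is fully populated
        }
    }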

Record audio on cue:
The new startRecording() method allows you to begin audio recording based on a cue defined by a MediaSyncEvent. The MediaSyncEvent specifies an audio session (such as one defined by MediaPlayer), which, when complete, triggers the audio recorder to begin recording. For example, you can play an audio tone that indicates the beginning of a recording session; recording then begins automatically, so you don't have to manually synchronize the tone and the start of recording.
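A minimal sketch of that sequence, assuming tonePlayer is an already-prepared MediaPlayer holding the cue sound (the sample rate and channel settings are illustrative, and the RECORD_AUDIO permission is required):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaPlayer;
    import android.media.MediaRecorder;
    import android.media.MediaSyncEvent;

    public class CuedRecording {
        // Start capturing only after the "start tone" played by tonePlayer finishes.
        public static AudioRecord recordAfterTone(MediaPlayer tonePlayer) {
            int sampleRate = 44100;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            // Tie the sync event to the player's audio session; recording begins
            // when playback of that session completes.
            MediaSyncEvent cue = MediaSyncEvent
                    .createEvent(MediaSyncEvent.SYNC_EVENT_PRESENTATION_COMPLETE)
                    .setAudioSessionId(tonePlayer.getAudioSessionId());

            recorder.startRecording(cue);   // waits for the cue, then starts capturing
            tonePlayer.start();             // play the tone; capture begins when it ends
            return recorder;
        }
    }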

Timed text tracks:
The MediaPlayer now handles both in-band and out-of-band text tracks. In-band text tracks come as a text track within an MP4 or 3GPP media source. Out-of-band text tracks can be added as an external text source via the addTimedTextSource() method. After all external text track sources are added, getTrackInfo() should be called to get the refreshed list of all available tracks in a data source.

To set the track to use with the MediaPlayer, you must call selectTrack(), using the index position for the track you want to use.
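A minimal sketch of that flow, assuming the player is already prepared and srtPath points to an external SubRip file (the path and log tag are placeholders):

    import android.media.MediaPlayer;
    import android.media.MediaPlayer.TrackInfo;
    import android.util.Log;

    public class TimedTextSetup {
        public static void enableSubtitles(MediaPlayer player, String srtPath)
                throws java.io.IOException {
            // Out-of-band track: add an external subtitle source.
            player.addTimedTextSource(srtPath, MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);

            // Refresh the track list, then find and select the timed-text track.
            TrackInfo[] tracks = player.getTrackInfo();
            for (int i = 0; i < tracks.length; i++) {
                if (tracks[i].getTrackType() == TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT) {
                    player.selectTrack(i);
                    break;
                }
            }

            // The selected track's text is delivered through OnTimedTextListener callbacks.
            player.setOnTimedTextListener((mp, text) ->
                    Log.d("TimedText", text != null ? text.getText() : ""));
        }
    }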


Audio effects:
The AudioEffect class now supports additional audio pre-processing types when capturing audio:
  • Acoustic Echo Canceler (AEC) with AcousticEchoCanceler removes the contribution of the signal received from the remote party from the captured audio signal.
  • Automatic Gain Control (AGC) with AutomaticGainControl automatically normalizes the output of the captured signal.
  • Noise Suppressor (NS) with NoiseSuppressor removes background noise from the captured signal.
You can apply these pre-processor effects on audio captured with an AudioRecord using one of the AudioEffect subclasses.
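A minimal sketch of attaching these effects to an existing AudioRecord session; availability varies by device, so each effect is checked before it is created:

    import android.media.AudioRecord;
    import android.media.audiofx.AcousticEchoCanceler;
    import android.media.audiofx.AutomaticGainControl;
    import android.media.audiofx.NoiseSuppressor;

    public class CapturePreprocessing {
        public static void attachEffects(AudioRecord recorder) {
            int sessionId = recorder.getAudioSessionId();

            if (AcousticEchoCanceler.isAvailable()) {
                AcousticEchoCanceler aec = AcousticEchoCanceler.create(sessionId);
                if (aec != null) aec.setEnabled(true);   // remove far-end echo from the capture
            }
            if (AutomaticGainControl.isAvailable()) {
                AutomaticGainControl agc = AutomaticGainControl.create(sessionId);
                if (agc != null) agc.setEnabled(true);   // normalize the captured signal level
            }
            if (NoiseSuppressor.isAvailable()) {
                NoiseSuppressor ns = NoiseSuppressor.create(sessionId);
                if (ns != null) ns.setEnabled(true);     // suppress steady background noise
            }
        }
    }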

Gapless playback:
You can now perform gapless playback between two separate MediaPlayer objects. At any time before your first MediaPlayer finishes, call setNextMediaPlayer() and Android attempts to start the second player the moment that the first one stops.
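A minimal sketch, assuming two file paths for consecutive tracks (the paths are placeholders):

    import android.media.MediaPlayer;

    public class GaplessPlayback {
        // Chain two players so the second starts the instant the first ends.
        public static void playBackToBack(String firstPath, String secondPath)
                throws java.io.IOException {
            MediaPlayer first = new MediaPlayer();
            MediaPlayer second = new MediaPlayer();

            first.setDataSource(firstPath);
            first.prepare();
            second.setDataSource(secondPath);
            second.prepare();

            // Must be called before 'first' completes; playback hands over without a gap.
            first.setNextMediaPlayer(second);

            first.setOnCompletionListener(MediaPlayer::release);  // release the finished player
            first.start();
        }
    }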

Android Ice Cream Sandwich (4.0):

Low-level streaming multimedia:
Android 4.0 provides a direct, efficient path for low-level streaming multimedia. The new path is ideal for applications that need to maintain complete control over media data before passing it to the platform for presentation. For example, media applications can now retrieve data from any source, apply proprietary encryption/decryption, and then send the data to the platform for display.

To support this low-level streaming, the platform introduces a new native API based on Khronos OpenMAX AL 1.0.1.

Audio remote controls:
Android 4.0 adds a new audio remote control API that lets media applications integrate with playback controls that are displayed in a remote view. Media applications can integrate with a remote music playback control built into the platform’s lock screen, allowing users to control song selection and playback without having to unlock and navigate to the music app.

Using the audio remote control API, any music or media app can register to receive media button events from the remote control and then manage play state accordingly. The application can also supply metadata to the remote control, such as album art or image, play state, track number and description, duration, genre, and more.
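A minimal sketch of that registration on Android 4.0, assuming a hypothetical MediaButtonReceiver declared in the manifest with an ACTION_MEDIA_BUTTON intent filter; the metadata values are placeholders:

    import android.app.PendingIntent;
    import android.content.BroadcastReceiver;
    import android.content.ComponentName;
    import android.content.Context;
    import android.content.Intent;
    import android.media.AudioManager;
    import android.media.MediaMetadataRetriever;
    import android.media.RemoteControlClient;

    public class LockScreenControls {
        public static RemoteControlClient register(Context context) {
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            ComponentName receiver = new ComponentName(context, MediaButtonReceiver.class);

            // Route media button events (play/pause, next, ...) to our receiver.
            am.registerMediaButtonEventReceiver(receiver);

            Intent buttonIntent = new Intent(Intent.ACTION_MEDIA_BUTTON);
            buttonIntent.setComponent(receiver);
            PendingIntent pi = PendingIntent.getBroadcast(context, 0, buttonIntent, 0);

            RemoteControlClient rcc = new RemoteControlClient(pi);
            am.registerRemoteControlClient(rcc);

            // Advertise which transport controls we support and the current play state.
            rcc.setTransportControlFlags(RemoteControlClient.FLAG_KEY_MEDIA_PLAY_PAUSE
                    | RemoteControlClient.FLAG_KEY_MEDIA_NEXT);
            rcc.setPlaybackState(RemoteControlClient.PLAYSTATE_PLAYING);

            // Supply metadata shown on the remote view (lock screen); values are placeholders.
            rcc.editMetadata(true)
               .putString(MediaMetadataRetriever.METADATA_KEY_TITLE, "Track title")
               .putString(MediaMetadataRetriever.METADATA_KEY_ARTIST, "Artist name")
               .apply();

            return rcc;
        }

        // Hypothetical receiver for media button events; it must also be declared
        // in AndroidManifest.xml with an ACTION_MEDIA_BUTTON intent filter.
        public static class MediaButtonReceiver extends BroadcastReceiver {
            @Override public void onReceive(Context context, Intent intent) {
                // The pressed key arrives as intent.getParcelableExtra(Intent.EXTRA_KEY_EVENT).
            }
        }
    }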

New media codecs and containers:
Android 4.0 adds support for additional media types and containers to give developers access to the formats they need. For high-quality compressed images, the media framework adds support for WebP content. For video, the framework now supports streaming VP8 content. For streaming multimedia, the framework supports HTTP Live streaming protocol version 3 and encoding of ADTS-contained AAC content. Additionally, developers can now use Matroska containers for Vorbis and VP8 content.