Thursday, July 7, 2011

Short code snippets useful for Mac apps



Hi!

This post covers some very basic concepts that we commonly use in iPhone apps but are often unsure how to implement in Mac apps. I hope it saves you some research time.


1. Loading a URL in a WebView in a Mac app


[[web mainFrame] loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"http://www.google.com/"]]];

Here ,"web" is an object of WebView.

2. Getting the table view row number that was selected (with a single click)


For this, set the table view's delegate and implement the method below:
- (BOOL)tableView:(NSTableView *)tableView shouldSelectRow:(NSInteger)row {
    rowSelected = row;
    return YES;
}
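Alternatively, a small sketch using the standard NSTableViewDelegate selection-change method (assuming "rowSelected" is the same ivar as above) reads the selection after it changes:

- (void)tableViewSelectionDidChange:(NSNotification *)notification {
    NSTableView *tableView = [notification object];
    rowSelected = [tableView selectedRow]; // -1 if nothing is selected
}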

3. Showing an alert message


NSString *theAlertMessage = [NSString stringWithFormat:@"Already added to rhymes sticky."];
NSAlert *alert = [[[NSAlert alloc] init] autorelease];
[alert setAlertStyle:NSCriticalAlertStyle];
[alert addButtonWithTitle:@"OK"];
[alert setMessageText:theAlertMessage];
[alert runModal]; // actually display the alert

Thursday, June 9, 2011

Flurry Integration



Introduction


The Flurry Analytics Agent allows you to track the usage and behavior of your application on users' phones for viewing in the Flurry Analytics system.

Note
1. The Flurry SDK will only work with Xcode 3.2.5 or above. If you need an SDK for an older Xcode version, please email support.
2. The Flurry Agent does not require the CoreLocation framework and will not collect GPS location by default. Developers who use their own CLLocationManager can set GPS location information in the Flurry Agent (see Optional Features for more information).

Integration


1. In the Finder, drag FlurryLib into your project's file folder. (NOTE: If you are upgrading the Flurry iPad SDK, be sure to remove any existing Flurry library folders from your project's file folder before proceeding.)
2. Now add it to your project: Project > Add to Project > FlurryLib. Choose 'Recursively create groups for any added folders'.
3. In your Application Delegate:
a. Import FlurryAPI: #import "FlurryAPI.h"
b. Inside applicationDidFinishLaunching: add [FlurryAPI startSession:@"YOUR_API_KEY"];

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    [FlurryAPI startSession:@"YOUR_API_KEY"];
    // your code
}

Features

a) Tracking User Behavior
[FlurryAPI logEvent:@"EVENT_NAME"];
Use logEvent to count the number of times certain events happen during a session of your application. This can be useful for measuring how often users perform various actions, for example. Your application is currently limited to counting occurrences for 300 different event ids (maximum length 255 characters).
[FlurryAPI logEvent:@"EVENT_NAME" withParameters:YOUR_NSDictionary];
Use this version of logEvent to count the number of times certain events happen during
a session of your application and to pass dynamic parameters to be recorded with that
event. Event parameters can be passed in as a NSDictionary object where the key and
value objects must be NSString objects. For example, you could record that a user used
your search box tool and also dynamically record which search terms the user entered.
 Your application is currently limited to counting occurrences for 300 different event ids
 (maximum length 255 characters). Maximum of 10 event parameters per
event is supported.
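For example, a hypothetical search event with one parameter (the event name, parameter key, and value here are invented for illustration) could be logged like this:

NSDictionary *searchParams = [NSDictionary dictionaryWithObjectsAndKeys:
                              @"nursery rhymes", @"search_term", nil];
[FlurryAPI logEvent:@"Search_Performed" withParameters:searchParams];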
[FlurryAPI logEvent:@"EVENT_NAME" timed:YES];
Use this version of logEvent to start timed event.
[FlurryAPI logEvent:@"EVENT_NAME" withParameters:YOUR_NSDictionary 
timed:YES];
Use this version of logEvent to start timed event with event parameters.
[FlurryAPI endTimedEvent:@"EVENT_NAME" withParameters:
YOUR_NSDictionary];
Use endTimedEvent to end timed event before app exists, otherwise timed events
 automatically end when app exists. When ending the timed event, a new event
parameters NSDictionary object can be used to update event parameters.
To keep event parameters the same, pass in nil for the event parameters NSDictionary
 object.
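Putting the timed-event calls together, a hypothetical example (the event name is invented for illustration) that times how long a user keeps a settings screen open might look like:

// When the screen appears
[FlurryAPI logEvent:@"Settings_Screen" timed:YES];
// ... later, when the screen is dismissed; pass nil to keep the original parameters
[FlurryAPI endTimedEvent:@"Settings_Screen" withParameters:nil];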
[FlurryAPI logAllPageViews:navigationController];
To enable the Flurry agent to automatically detect and log page views, pass an instance of UINavigationController or UITabBarController to logAllPageViews. The Flurry agent will create a delegate on your object to detect user interactions. Each detected user interaction will automatically be logged as a page view. Each instance only needs to be passed to the Flurry agent once. Multiple UINavigationController or UITabBarController instances can be passed to the Flurry agent.
[FlurryAPI logPageView];
In the absence of a UINavigationController or UITabBarController, you can manually detect user interactions. For each user interaction you want to log manually, you can use logPageView to log the page view.

b) Tracking Application Errors
[FlurryAPI logError:@"ERROR_NAME" message:@"ERROR_MESSAGE" exception:e];
Use this to log exceptions and/or errors that occur in your app. Flurry will report the first 10 errors that occur in each session.
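For instance, a rough sketch of wrapping risky code and reporting the exception (the error name and message below are placeholders):

@try {
    // code that might throw, e.g. parsing a server response
}
@catch (NSException *e) {
    [FlurryAPI logError:@"Parse_Error" message:@"Failed to parse server response" exception:e];
}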

c) Tracking Demographics
[FlurryAPI setUserID:@"USER_ID"];
Use this to log the user's assigned ID or username in your system after identifying the user.
[FlurryAPI setAge:21];
Use this to log the user's age after identifying the user. Valid inputs are 0 or greater.
[FlurryAPI setGender:@"m"];
Use this to log the user's gender after identifying the user. Valid inputs are @"m" (male) or @"f" (female).
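A quick illustrative example (the values are placeholders) combining the three calls once the user has been identified:

[FlurryAPI setUserID:@"user_12345"];
[FlurryAPI setAge:27];
[FlurryAPI setGender:@"f"];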

d) Tracking Location
CLLocationManager *locationManager = [[CLLocationManager alloc] init];
[locationManager startUpdatingLocation];
CLLocation *location = locationManager.location;
[FlurryAPI setLatitude:location.coordinate.latitude
             longitude:location.coordinate.longitude
    horizontalAccuracy:location.horizontalAccuracy
      verticalAccuracy:location.verticalAccuracy];
This allows you to set the current GPS location of the user. Flurry will keep only the last location information. If your app does not use location services in a meaningful way, using CLLocationManager can result in Apple rejecting the app submission.

e) Controlling Data Reporting
[FlurryAPI setSessionReportsOnCloseEnabled:(BOOL)sendSessionReportsOnClose];
This option is on by default. When enabled, Flurry will attempt to send session data when the app is exited, in addition to the normal send when the app is started. This will improve the speed at which your application analytics are updated, but can prolong the app termination process due to network latency. In some cases, the network latency can cause the app to crash.
[FlurryAPI setSessionReportsOnPauseEnabled:(BOOL)sendSessionReportsOnPause];
This option is on by default. When enabled, Flurry will attempt to send session data when the app is paused, in addition to the normal send when the app is started. This will improve the speed at which your application analytics are updated, but can prolong the app pause process due to network latency. In some cases, the network latency can cause the app to crash.
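So, if you see crashes or slow exits that you suspect are caused by this reporting, a simple mitigation (at the cost of slower analytics updates) is to turn both options off right after starting the session:

[FlurryAPI setSessionReportsOnCloseEnabled:NO];
[FlurryAPI setSessionReportsOnPauseEnabled:NO];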

"EXC _BAD_ACCESS" error handling


All of us are well aware of this error, and most of us also know the usual solution: enabling zombies. That often fixes the problem. There are, however, scenarios where you know what kind of object was over-released but, even after spending a long time debugging, you cannot find the exact line where the problem occurs. The same thing happened to me, and I found a short, simple solution. This post throws light on using zombies in your project and also on how to get the exact line number. I am sure it will be helpful for you all.

Introduction



This kind of problem is usually the result of over-releasing an object. It can be very confusing, since the failure tends to occur well after the mistake is made. The crash can also occur while the program is deep in framework code, often with none of your own code visible in the stack.
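As a contrived illustration of the pattern (not taken from any real project), the following over-release usually crashes later, when the autorelease pool drains, far away from the code that caused it:

NSArray *items = [NSArray arrayWithObjects:@"one", @"two", nil]; // autoreleased, we do not own it
// ... some work ...
[items release]; // wrong: the pool's later release now hits a freed object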

Enabling Zombies


Steps

1. Select Groups & Files > Executables

2. Select your project within Executables and press the Info button; the following screen pops up:


3. Select Arguments from top segmented control




4. Now select the bottom "+" button


& add the following key value pairs
NSZombieEnabled = YES
CFZombie = 5
MallocStackLoggingNoCompact = 1



Debugging "EXC_BAD_ACESS" error


After enabling zombies and running the project, you will get something like this in your console:


(gdb) continue
2011-06-09 11:46:08.404 test[6842:40b] *** -[_NSArrayI release]: message sent to deallocated instance 0x64a4900

Then add

(gdb) info malloc-history 0x64a4900
and it will show the lines of code that were responsible for the error.

Friday, May 6, 2011

Printing Option in Mac App



- (void)print:(id)sender {
    // Page settings for printing
    [self setPrintInfo:[NSPrintInfo sharedPrintInfo]];
    [printInfo setVerticalPagination:NSAutoPagination];
    float horizontalMargin, verticalMargin;
    horizontalMargin = 0;
    verticalMargin = -100;
    [printInfo setLeftMargin:horizontalMargin];
    [printInfo setRightMargin:horizontalMargin];
    [printInfo setHorizontallyCentered:YES];
    [printInfo setTopMargin:-600];
    [printInfo setBottomMargin:verticalMargin];
    [sampleText setHidden:NO];
    [[NSPrintOperation printOperationWithView:staticText] runOperation];
}

Connect this method to the button through which you want to provide the print option.
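If you prefer to wire the button up in code rather than in Interface Builder, a minimal sketch (assuming "printButton" is an NSButton outlet) would be:

// Send the print: action above to this controller when the button is clicked
[printButton setTarget:self];
[printButton setAction:@selector(print:)];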

Thursday, April 21, 2011

Audio Recording in Mac App


I spent a lot of time researching this, so I am happy to finally share it with all of you.


The steps for integrating this are:

1. Add the following frameworks to your app as existing frameworks

a) QTKit.framework
b) AudioUnit.framework
c) AudioToolbox.framework

2. In your .h file, add the code below

#import <Cocoa/Cocoa.h>
#import <AudioUnit/AudioUnit.h>
#import <AudioToolbox/AudioToolbox.h>
@class QTCaptureSession;
@class QTCaptureDeviceInput;
@class QTCaptureDecompressedAudioOutput;
@interface CaptureSessionController : NSObject <NSWindowDelegate> {
IBOutlet NSWindow *window;
@private
QTCaptureSession *captureSession;
QTCaptureDeviceInput *captureAudioDeviceInput;
QTCaptureDecompressedAudioOutput *captureAudioDataOutput;
AudioUnit effectAudioUnit;
ExtAudioFileRef extAudioFile;
AudioStreamBasicDescription currentInputASBD;
AudioBufferList *currentInputAudioBufferList;
double currentSampleTime;
BOOL didSetUpAudioUnits;
NSString *outputFile;
BOOL recording;
}
@property(copy) NSString *outputFile;
@property(getter=isRecording) BOOL recording;
- (IBAction)chooseOutputFile:(id)sender;
@end


3. In your .m file, add the code below

#import "CaptureSessionController.h"
#import <QTKit/QTKit.h>
static OSStatus PushCurrentInputBufferIntoAudioUnit(void * inRefCon,
AudioUnitRenderActionFlags * ioActionFlags,
const AudioTimeStamp * inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList * ioData);
@implementation CaptureSessionController
#pragma mark ======== Setup and teardown methods =========
- (id)init
{
self = [super init];
if (self) {
[self setOutputFile:[@"~/Desktop/Audio Recording.aif" stringByStandardizingPath]];
}
return self;
}
- (void)awakeFromNib
{
BOOL success;
NSError *error;
/* Find and open an audio input device. */
QTCaptureDevice *audioDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
success = [audioDevice open:&error];
if (!success) {
[[NSAlert alertWithError:error] runModal];
return;
}
/* Create the capture session. */
captureSession = [[QTCaptureSession alloc] init];
/* Add a device input for the audio device to the session. */
captureAudioDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
success = [captureSession addInput:captureAudioDeviceInput error:&error];
if (!success) {
[captureAudioDeviceInput release];
captureAudioDeviceInput = nil;
[audioDevice close];
[captureSession release];
captureSession = nil;
[[NSAlert alertWithError:error] runModal];
return;
}
/* Create an audio data output for reading captured audio buffers and add it to the capture session. */
captureAudioDataOutput = [[QTCaptureDecompressedAudioOutput alloc] init];
[captureAudioDataOutput setDelegate:self]; /* Captured audio buffers will be provided to the delegate via the captureOutput:didOutputAudioSampleBuffer:fromConnection: delegate method. */
success = [captureSession addOutput:captureAudioDataOutput error:&error];
if (!success) {
[captureAudioDeviceInput release];
captureAudioDeviceInput = nil;
[audioDevice close]; 
[captureAudioDataOutput release];
captureAudioDataOutput = nil;
[captureSession release];
captureSession = nil;
[[NSAlert alertWithError:error] runModal];
return;
}
/* Create an effect audio unit to add an effect to the audio before it is written to a file. */
OSStatus err = noErr;
AudioComponentDescription effectAudioUnitComponentDescription;
effectAudioUnitComponentDescription.componentType = kAudioUnitType_Effect;
effectAudioUnitComponentDescription.componentSubType = kAudioUnitSubType_Delay;
effectAudioUnitComponentDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
effectAudioUnitComponentDescription.componentFlags = 0;
effectAudioUnitComponentDescription.componentFlagsMask = 0;
AudioComponent effectAudioUnitComponent = AudioComponentFindNext(NULL, &effectAudioUnitComponentDescription);
err = AudioComponentInstanceNew(effectAudioUnitComponent, &effectAudioUnit);
if (noErr == err) {
/* Set a callback on the effect unit that will supply the audio buffers received from the audio data output. */
AURenderCallbackStruct renderCallbackStruct;
renderCallbackStruct.inputProc = PushCurrentInputBufferIntoAudioUnit;
renderCallbackStruct.inputProcRefCon = self;
err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &renderCallbackStruct, sizeof(renderCallbackStruct)); 
}
if (noErr != err) {
if (effectAudioUnit) {
AudioComponentInstanceDispose(effectAudioUnit);
effectAudioUnit = NULL;
}
[captureAudioDeviceInput release];
captureAudioDeviceInput = nil;
[audioDevice close];
[captureSession release];
captureSession = nil;
[[NSAlert alertWithError:[NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]] runModal];
return;
}
/* Start the capture session. This will cause the audio data output delegate method to be called for each new audio buffer that is captured from the input device. */
[captureSession startRunning];
/* Become the window's delegate so that the capture session can be stopped and cleaned up immediately after the window is closed. */
[window setDelegate:self];
}
- (void)windowWillClose:(NSNotification *)notification
{
[self setRecording:NO];
[captureSession stopRunning];
QTCaptureDevice *audioDevice = [captureAudioDeviceInput device];
if ([audioDevice isOpen])
[audioDevice close];
}
- (void)dealloc
{
[captureSession release];
[captureAudioDeviceInput release];
[captureAudioDataOutput release];
[outputFile release];
if (extAudioFile)
ExtAudioFileDispose(extAudioFile);
if (effectAudioUnit) {
if (didSetUpAudioUnits)
AudioUnitUninitialize(effectAudioUnit);
AudioComponentInstanceDispose(effectAudioUnit);
}
[super dealloc];
}
#pragma mark ======== Audio capture methods =========
/*
Called periodically by the QTCaptureAudioDataOutput as it receives QTSampleBuffer objects containing audio frames captured by the QTCaptureSession.
Each QTSampleBuffer will contain multiple frames of audio encoded in the canonical non-interleaved linear PCM format compatible with AudioUnits.
*/
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputAudioSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
OSStatus err = noErr;
BOOL isRecording = [self isRecording];
/* Get the sample buffer's AudioStreamBasicDescription, which will be used to set the input format of the effect audio unit and the ExtAudioFile. */
QTFormatDescription *formatDescription = [sampleBuffer formatDescription];
NSValue *sampleBufferASBDValue = [formatDescription attributeForKey:QTFormatDescriptionAudioStreamBasicDescriptionAttribute];
if (!sampleBufferASBDValue)
return;
AudioStreamBasicDescription sampleBufferASBD = {0};
[sampleBufferASBDValue getValue:&sampleBufferASBD]; 
if ((sampleBufferASBD.mChannelsPerFrame != currentInputASBD.mChannelsPerFrame) || (sampleBufferASBD.mSampleRate != currentInputASBD.mSampleRate)) {
/* Although QTCaptureAudioDataOutput guarantees that it will output sample buffers in the canonical format, the number of channels or the
sample rate of the audio can change at any time while the capture session is running. If this occurs, the audio unit receiving the buffers
from the QTCaptureAudioDataOutput needs to be reconfigured with the new format. This also must be done when a buffer is received for the
first time. */
currentInputASBD = sampleBufferASBD;
if (didSetUpAudioUnits) {
/* The audio units were previously set up, so they must be uninitialized now. */
AudioUnitUninitialize(effectAudioUnit);
/* If recording was in progress, the recording needs to be stopped because the audio format changed. */
if (extAudioFile) {
ExtAudioFileDispose(extAudioFile);
extAudioFile = NULL;
}
} else {
didSetUpAudioUnits = YES;
}
/* Set the input and output formats of the effect audio unit to match that of the sample buffer. */
err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &currentInputASBD, sizeof(currentInputASBD));
if (noErr == err)
err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &currentInputASBD, sizeof(currentInputASBD));
if (noErr == err)
err = AudioUnitInitialize(effectAudioUnit);
if (noErr != err) {
NSLog(@"Failed to set up audio units (%d)", err);
didSetUpAudioUnits = NO;
bzero(&currentInputASBD, sizeof(currentInputASBD));
}
}
if (isRecording && !extAudioFile) {
/* Start recording by creating an ExtAudioFile and configuring it with the same sample rate and channel layout as those of the current sample buffer. */
AudioStreamBasicDescription recordedASBD = {0};
recordedASBD.mSampleRate = currentInputASBD.mSampleRate;
recordedASBD.mFormatID = kAudioFormatLinearPCM;
recordedASBD.mFormatFlags = kAudioFormatFlagIsBigEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
recordedASBD.mBytesPerPacket = 2 * currentInputASBD.mChannelsPerFrame;
recordedASBD.mFramesPerPacket = 1;
recordedASBD.mBytesPerFrame = 2 * currentInputASBD.mChannelsPerFrame;
recordedASBD.mChannelsPerFrame = currentInputASBD.mChannelsPerFrame;
recordedASBD.mBitsPerChannel = 16;
NSData *inputChannelLayoutData = [formatDescription attributeForKey:QTFormatDescriptionAudioChannelLayoutAttribute];
AudioChannelLayout *recordedChannelLayout = (AudioChannelLayout *)[inputChannelLayoutData bytes];
err = ExtAudioFileCreateWithURL((CFURLRef)[NSURL fileURLWithPath:[self outputFile]],
kAudioFileAIFFType,
&recordedASBD,
recordedChannelLayout,
kAudioFileFlags_EraseFile,
&extAudioFile);
if (noErr == err) 
err = ExtAudioFileSetProperty(extAudioFile, kExtAudioFileProperty_ClientDataFormat, sizeof(currentInputASBD), &currentInputASBD);
if (noErr != err) {
NSLog(@"Failed to set up ExtAudioFile (%d)", err);
ExtAudioFileDispose(extAudioFile);
extAudioFile = NULL;
}
} else if (!isRecording && extAudioFile) {
/* Stop recording by disposing of the ExtAudioFile. */
ExtAudioFileDispose(extAudioFile);
extAudioFile = NULL;
}
NSUInteger numberOfFrames = [sampleBuffer numberOfSamples]; /* -[QTSampleBuffer numberOfSamples] corresponds to the number of CoreAudio audio frames. */
/* In order to render continuously, the effect audio unit needs a new time stamp for each buffer. Use the number of frames for each unit of time. */
currentSampleTime += (double)numberOfFrames;
AudioTimeStamp timeStamp = {0};
timeStamp.mSampleTime = currentSampleTime;
timeStamp.mFlags |= kAudioTimeStampSampleTimeValid; 
AudioUnitRenderActionFlags flags = 0;
/* Create an AudioBufferList large enough to hold the number of frames from the sample buffer in 32-bit floating point PCM format. */
AudioBufferList *outputABL = calloc(1, sizeof(*outputABL) + (currentInputASBD.mChannelsPerFrame - 1)*sizeof(outputABL->mBuffers[0]));
outputABL->mNumberBuffers = currentInputASBD.mChannelsPerFrame;
UInt32 channelIndex;
for (channelIndex = 0; channelIndex < currentInputASBD.mChannelsPerFrame; channelIndex++) {
UInt32 dataSize = numberOfFrames * currentInputASBD.mBytesPerFrame;
outputABL->mBuffers[channelIndex].mDataByteSize = dataSize;
outputABL->mBuffers[channelIndex].mData = malloc(dataSize);
outputABL->mBuffers[channelIndex].mNumberChannels = 1;
}
/*
Get an audio buffer list from the sample buffer and assign it to the currentInputAudioBufferList instance variable.
The effect audio unit render callback, PushCurrentInputBufferIntoAudioUnit(), can access this value by calling the currentInputAudioBufferList method.
*/
currentInputAudioBufferList = [sampleBuffer audioBufferListWithOptions:QTSampleBufferAudioBufferListOptionAssure16ByteAlignment];
/* Tell the effect audio unit to render. This will synchronously call PushCurrentInputBufferIntoAudioUnit(), which will feed the audio buffer list into the effect audio unit. */
err = AudioUnitRender(effectAudioUnit, &flags, &timeStamp, 0, numberOfFrames, outputABL);
currentInputAudioBufferList = NULL;
if ((noErr == err) && extAudioFile) {
err = ExtAudioFileWriteAsync(extAudioFile, numberOfFrames, outputABL);
}
for (channelIndex = 0; channelIndex < currentInputASBD.mChannelsPerFrame; channelIndex++) {
free(outputABL->mBuffers[channelIndex].mData);
}
free(outputABL);
}
/* Used by PushCurrentInputBufferIntoAudioUnit() to access the current audio buffer list that has been output by the QTCaptureAudioDataOutput. */
- (AudioBufferList *)currentInputAudioBufferList
{
return currentInputAudioBufferList;
}
#pragma mark ======== Property and action definitions =========
@synthesize outputFile = outputFile;
@synthesize recording = recording;
- (IBAction)chooseOutputFile:(id)sender
{
NSSavePanel *savePanel = [NSSavePanel savePanel];
[savePanel setAllowedFileTypes:[NSArray arrayWithObject:@"aif"]];
[savePanel setCanSelectHiddenExtension:YES];
NSInteger result = [savePanel runModal];
if (NSOKButton == result) {
[self setOutputFile:[savePanel filename]];
}
}
@end
#pragma mark ======== AudioUnit render callback =========
/*
Synchronously called by the effect audio unit whenever AudioUnitRender() is called.
Used to feed the audio samples output by the QTCaptureAudioDataOutput to the AudioUnit.
*/
static OSStatus PushCurrentInputBufferIntoAudioUnit(void * inRefCon,
AudioUnitRenderActionFlags * ioActionFlags,
const AudioTimeStamp * inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList * ioData)
{
CaptureSessionController *self = (CaptureSessionController *)inRefCon;
AudioBufferList *currentInputAudioBufferList = [self currentInputAudioBufferList];
UInt32 bufferIndex, bufferCount = currentInputAudioBufferList->mNumberBuffers;
if (bufferCount != ioData->mNumberBuffers)
return badFormat;
/* Fill the provided AudioBufferList with the data from the AudioBufferList output by the audio data output. */
for (bufferIndex = 0; bufferIndex < bufferCount; bufferIndex++) {
ioData->mBuffers[bufferIndex].mDataByteSize = currentInputAudioBufferList->mBuffers[bufferIndex].mDataByteSize;
ioData->mBuffers[bufferIndex].mData = currentInputAudioBufferList->mBuffers[bufferIndex].mData;
ioData->mBuffers[bufferIndex].mNumberChannels = currentInputAudioBufferList->mBuffers[bufferIndex].mNumberChannels;
}
return noErr;
}

4. In your xib, add a button for starting and stopping the recording. I have an NSButton in the contentView of my window. Your xib should look like this:



5. You also need to set up the bindings for the NSButton. To do that, select the button and configure its bindings in the Bindings inspector.
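If you would rather create the binding in code instead of in Interface Builder, a rough sketch (assuming "recordButton" is an NSButton outlet and "captureSessionController" is the CaptureSessionController instance) could look like this:

// Bind the button's value to the controller's "recording" property;
// toggling the button then starts and stops recording via the property setter.
[recordButton bind:@"value" toObject:captureSessionController withKeyPath:@"recording" options:nil];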


Thursday, April 7, 2011

Dropbox Integration on iPhone


Requirements:


1. You need the 4.0 version of the iPhone SDK. Your Xcode version should be at least 3.2.3.
2. You need to have registered as a Dropbox application with mobile access at http://dropbox.com/developers. You should have a consumer key and secret.
3. You need to download the Dropbox SDK from https://www.dropbox.com/developers/releases

A. Adding DropboxSDK to your project


1. Open your project in XCode
2. Right-click on your project in the group tree in the left pane
3. Select Add -> Existing Files...
4. Navigate to where you uncompressed the Dropbox SDK and select the DropboxSDK
subfolder
5. Select "Copy items into destination group's folder"
6. Make sure "Recursively create groups for any added folders" is selected
7. Press Add button
8. Find the Frameworks folder in your app's group tree in the left pane
9. Make sure the framework Security.framework is added to your project
10. If not, right-click on Frameworks and select Add -> Existing Frameworks...
11. Select Security.framework from the list and select Add
12. Build your application. At this point you should have no build failures or warnings.

B. Logging in successfully from your app


1. In your application delegate's application:didFinishLaunchingWithOptions:
method, add the following code:

DBSession* dbSession = [[[DBSession alloc] initWithConsumerKey:@"<YOUR CONSUMER KEY>" consumerSecret:@"<YOUR CONSUMER SECRET>"] autorelease];
[DBSession setSharedSession:dbSession];

Note: you will need to #import "DropboxSDK.h" at the top of this file

2. Somewhere in your app, add an event to launch the login controller, which
should look something like this:

- (void)didPressLink {
DBLoginController* controller = [[DBLoginController new] autorelease];
[controller presentFromController:self];
}

Note: you will need to #import "DropboxSDK.h" at the top of this file


C. Creating a folder in your Dropbox using your app


1. In your .m file add the below code,

@interface DropBoxViewController () <DBLoginControllerDelegate, DBRestClientDelegate>

@property (nonatomic, readonly) DBRestClient* restClient;
@end
#pragma mark -
#pragma mark DBLoginControllerDelegate methods
- (void)loginControllerDidLogin:(DBLoginController*)controller
{
restClient = [self restClient];
[restClient setDelegate:self];
NSUserDefaults *def = [NSUserDefaults standardUserDefaults]; // remember the login state
[def setBool:YES forKey:@"userLoggedToDropboxAccnt"];
[NSUserDefaults resetStandardUserDefaults];
[restClient loadMetadata:@"" withHash:photosHash];
}
- (void)loginControllerDidCancel:(DBLoginController*)controller {
}
- (DBRestClient*)restClient {
if (restClient == nil) {
restClient = [[DBRestClient alloc] initWithSession:[DBSession sharedSession]];
restClient.delegate = self;
}
return restClient;
}
#pragma mark -
#pragma mark DBRestClientDelegate methods
- (void)restClient:(DBRestClient*)client loadedMetadata:(DBMetadata*)metadata {
[photosHash release];
photosHash = [metadata.hash retain];
NSMutableArray* newPhotoPaths = [NSMutableArray new];
for (DBMetadata* child in metadata.contents) {
[newPhotoPaths addObject:child.path];
}
[photoPaths release];
photoPaths = newPhotoPaths;
self.contentArray = photoPaths;
if(![photoPaths containsObject:folderNmTxtField.text]){
[restClient createFolder:folderNmTxtField.text];
}
}

Here, photosHash is an NSString and photoPaths is an NSArray, both declared in the .h file.

D. Uploading a file to your Dropbox using your app


If restClient was not initialized earlier, add the code below:

restClient=[self restClient];
[restClient setDelegate:self];
[restClient loadMetadata:@"" withHash:photosHash];

For uploading:

[restClient uploadFile:filename toPath:(the folder into which the file is to be uploaded) fromPath:(the path of the file to be uploaded)];
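A concrete, purely illustrative example (the file name and folder are invented) that uploads "notes.txt" from the app's Documents directory into a "/Backups" folder in the user's Dropbox might look like:

NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *localPath = [docsDir stringByAppendingPathComponent:@"notes.txt"];
[restClient uploadFile:@"notes.txt" toPath:@"/Backups" fromPath:localPath];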


Thursday, March 10, 2011

Playing back a movie file stored in the project resources





Introduction


This blog entry adds methods to the UIViewController class for presenting and dismissing a movie player using a specific set of animations. The transitions used by these methods are the same ones used by the YouTube and iPod applications to display video content.


Steps :-


1. Add "MediaPlayer.framework " as existing framework

2. Create another view controller class; suppose we name it "MoviePlayerViewController"

3. In "MoviePlayerViewController.h", add the below code,

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
@interface MoviePlayerViewController : UIViewController 
{ 
MPMoviePlayerController *mp;
NSURL *movieURL;
BOOL isFirstLoad; // used when adjusting the view for landscape playback
}
- (id)initWithPath:(NSString *)moviePath;
- (void)readyPlayer;
- (void) changeTheViewToPortrait:(BOOL)portrait andDuration:(NSTimeInterval)duration;
@end
4. In "MoviePlayerViewController.m", add the below code,
#import "MoviePlayerViewController.h"
#pragma mark -
#pragma mark Compiler Directives & Static Variables
@implementation MoviePlayerViewController
-(void)viewWillDisappear:(BOOL) animated
{  
if(mp) {
[mp stop];
}
}
- (id)initWithPath:(NSString *)moviePath
{ 
mp.controlStyle =  MPMovieControlStyleNone;
// Initialize and create movie URL
if (self = [super init])
{
movieURL = [NSURL fileURLWithPath:moviePath];    
[movieURL retain];
}
return self;
}
- (void) moviePlayerLoadStateChanged:(NSNotification*)notification 
{   
[mp setControlStyle:MPMovieControlStyleFullscreen];
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 30200
NSString *deviceModel = [NSString stringWithFormat: @"Device Type: %@\n", [[UIDevice currentDevice] model]];    
NSRange range = [deviceModel rangeOfString:@"iPad"];
if(range.location != NSNotFound){
if ([mp loadState] != MPMovieLoadStateUnknown)
{
[[NSNotificationCenter     defaultCenter] removeObserver:self name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
//view bounds are defined as per the iPad screen, so adjust accordingly
[[self view] setBounds:CGRectMake(0, 0, 1024, 748)];
UIInterfaceOrientation interfaceOrientation = [self interfaceOrientation];
if(interfaceOrientation == UIInterfaceOrientationLandscapeLeft ){
[[mp view] setFrame:CGRectMake(0, 0, 1024, 748)];
}
else if(interfaceOrientation == UIInterfaceOrientationLandscapeRight ){
[[mp view] setFrame:CGRectMake(0, 0, 1024, 748)];
}
[[self view] addSubview:[mp view]];   
[mp play];
}
}
else
{  
if ([mp loadState] != MPMovieLoadStateUnknown)
{
[[NSNotificationCenter     defaultCenter] removeObserver:self name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
UIInterfaceOrientation interfaceOrientation = [self interfaceOrientation];
if(interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight){
[[UIApplication sharedApplication] setStatusBarOrientation:interfaceOrientation animated:NO];
}
[[UIApplication sharedApplication] setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];
[[self view] setBounds:CGRectMake(0, 0, 1024, 768)];
[[self view] setCenter:CGPointMake(160, 240)];
[[mp view] setFrame:CGRectMake(0, 0, 1024, 748)];
[[self view] setTransform:CGAffineTransformMakeRotation(M_PI / 2)];
[[self view] addSubview:[mp view]];   
[mp play];
}
}
#endif
}
/*---------------------------------------------------------------------------
* For 3.1.x devices
* For 3.2 and 4.x see moviePlayerLoadStateChanged: 
*--------------------------------------------------------------------------*/
- (void) moviePreloadDidFinish:(NSNotification*)notification 
{   
[[UIApplication sharedApplication] setStatusBarHidden:YES];
[[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
[mp play];
}
/*---------------------------------------------------------------------------
* 
*--------------------------------------------------------------------------*/
- (void) moviePlayBackDidFinish:(NSNotification*)notification 
{    
[[UIApplication sharedApplication] setStatusBarHidden:YES];
[[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackDidFinishNotification object:nil];
[self dismissModalViewControllerAnimated:NO];    
}
/*---------------------------------------------------------------------------
*
*--------------------------------------------------------------------------*/
- (void) readyPlayer
{   
[[UIApplication sharedApplication] setStatusBarHidden:YES];
mp =  [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
mp.scalingMode= MPMovieScalingModeAspectFit;
if ([mp respondsToSelector:@selector(loadState)]) 
{
// Set movie player layout
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 30200 
//[mp setControlStyle:MPMovieControlStyleFullscreen];
[mp setControlStyle:MPMovieControlStyleNone];
[mp setFullscreen:YES];
#endif
// May help to reduce latency
[mp prepareToPlay];
// Register that the load state changed (movie is ready)
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 30200 
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(moviePlayerLoadStateChanged:) name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
#endif
}  
else
{
// Register to receive a notification when the movie is in memory and ready to play.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(moviePreloadDidFinish:) name:MPMoviePlayerContentPreloadDidFinishNotification object:nil];
//[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(moviePreloadDidFinish:) name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
}
// Register to receive a notification when the movie has finished playing. 
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(moviePlayBackDidFinish:) name:MPMoviePlayerPlaybackDidFinishNotification object:nil];
}
/*---------------------------------------------------------------------------
* 
*--------------------------------------------------------------------------*/
- (void) loadView
{ 
isFirstLoad = YES;
[self setView:[[[UIView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]] autorelease]];
[[self view] setBackgroundColor:[UIColor blackColor]];
self.view.clipsToBounds = YES;
self.view.autoresizesSubviews = YES;
if(mp) {
[mp stop];
}
UIInterfaceOrientation interfaceOrientation = [self interfaceOrientation];
if(interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight){
[[UIApplication sharedApplication] setStatusBarOrientation:interfaceOrientation animated:NO];
}        
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
- (void)dealloc 
{
[mp release];
[movieURL release];
[super dealloc];
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
#pragma mark -
#pragma mark InterfaceOrientationMethods
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
return (interfaceOrientation == UIInterfaceOrientationLandscapeLeft ||
interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration{
[super willRotateToInterfaceOrientation:toInterfaceOrientation duration:duration];
if(toInterfaceOrientation == UIInterfaceOrientationLandscapeLeft|| toInterfaceOrientation == UIInterfaceOrientationLandscapeRight){
//[self changeTheViewToPortrait:NO andDuration:duration];
}
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation{
if(fromInterfaceOrientation == UIInterfaceOrientationLandscapeLeft || fromInterfaceOrientation == UIInterfaceOrientationLandscapeRight){
[self changeTheViewToPortrait:NO andDuration:0.0];
}
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
- (void) changeTheViewToPortrait:(BOOL)portrait andDuration:(NSTimeInterval)duration{
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationDuration:duration];
if(portrait){
}
else if (isFirstLoad){
isFirstLoad = FALSE;
UIInterfaceOrientation interfaceOrientation = [self interfaceOrientation];
if(interfaceOrientation == UIInterfaceOrientationLandscapeLeft ){
[[UIApplication sharedApplication] setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];
[[self view] setBounds:CGRectMake(0, 0, 1024, 748)];
}
else if(interfaceOrientation == UIInterfaceOrientationLandscapeRight ){
NSLog(@"UIInterfaceOrientationLandscapeRight");
[[UIApplication sharedApplication] setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];
[[self view] setBounds:CGRectMake(0, 0, 1024, 748)];
}
}
[UIView commitAnimations];
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------------------
@end

5. Now, wherever you want to play the movie file, add the code below:
NSString *movieFile = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp4"];

// Create the custom movie player
moviePlayer = [[[MoviePlayerViewController alloc] initWithPath:movieFile] autorelease];

[self presentModalViewController:moviePlayer animated:NO];
[moviePlayer readyPlayer];
In this class, import "MoviePlayerViewController.h"; here, "moviePlayer" is a global object of type MoviePlayerViewController.