I downloaded the WiTap sample code from Apple's website. It's for transferring data over a local Wi-Fi network. I am working on a project with a client-server architecture, and I am sending NSData from the client side to the server.
I made two projects: one for the client and one for the server.
On the client side, I modified the AppController.m file by adding the following method.
AppController.m (Client side)
- (void)sendData:(NSData *)pobjData
{
    assert(self.streamOpenCount == 2);
    if ([self.outputStream hasSpaceAvailable]) {
        NSInteger bytesWritten = [self.outputStream write:[pobjData bytes]
                                                maxLength:[pobjData length]];
        NSLog(@"written bytes -> %ld", (long)bytesWritten);
    }
}
Then I send data by calling this method.
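For example, to send a PNG (the image name here is just a placeholder):

- (void)sendImage
{
    UIImage *image = [UIImage imageNamed:@"photo.png"];   // placeholder asset
    NSData *pngData = UIImagePNGRepresentation(image);
    [self sendData:pngData];
}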
On the server side, I made the following changes: I modified AppController.m by changing the following method.
AppController.m (Server side)
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode
{
    #pragma unused(stream)
    switch (eventCode) {
        case NSStreamEventOpenCompleted: {
            self.streamOpenCount += 1;
            assert(self.streamOpenCount <= 2);
            // Once both streams are open we hide the picker and the game is on.
            if (self.streamOpenCount == 2) {
                [self dismissPicker];
                [self.server deregister];
            }
        } break;
        case NSStreamEventHasSpaceAvailable: {
            assert(stream == self.outputStream);
            // do nothing
        } break;
        case NSStreamEventHasBytesAvailable: {
            if (stream == self.inputStream) {
                NSInteger bytesRead;
                uint32_t buffer[32768];
                NSMutableData *_data = [NSMutableData data];
                // Pull some data off the network.
                bytesRead = [self.inputStream read:(uint8_t *)buffer maxLength:sizeof(buffer)];
                if (bytesRead == -1) {
                } else if (bytesRead == 0) {
                } else {
                    // FIXME: Pop up an alert
                    const long long expectedContentLength = bytesRead;
                    // expectedContentLength can be represented as NSUInteger, so cast it:
                    NSUInteger expectedSize = (NSUInteger)expectedContentLength;
                    [_data appendBytes:buffer length:expectedSize];
                    NSLog(@"Data received has length: %lu", (unsigned long)[_data length]);
                    [self performSelector:@selector(getData:) withObject:_data afterDelay:1.0];
                }
            }
        } break;
        default:
            assert(NO);
            // fall through
        case NSStreamEventErrorOccurred:
            // fall through
        case NSStreamEventEndEncountered: {
            [self setupForNewGame];
        } break;
    }
}
and added a method to write the received data to a file:
#define kUserDirectoryPath NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)

- (void)getData:(NSMutableData *)pData
{
    NSFileManager *tmpmanager = [NSFileManager defaultManager];
    [tmpmanager createFileAtPath:[AppController getDocumentDirectoryPath:[NSString stringWithFormat:@"%@.png", [NSDate date]]]
                        contents:pData
                      attributes:nil];
}

+ (NSString *)getDocumentDirectoryPath:(NSString *)pStrPathName
{
    NSString *strPath = nil;
    if (pStrPathName)
        strPath = [[kUserDirectoryPath objectAtIndex:0] stringByAppendingPathComponent:pStrPathName];
    return strPath;
}
I convert .png files to NSData and send them from the client side to the server side; the server writes each file to the Documents directory.
The trouble is that while everything works fine for tiny files, once the file size exceeds about 8 kB the file written to the Documents directory is corrupted.
Kindly help me send large files.
The problem is that your code doesn't loop to gather all of the available data until the end (or loop to send all the data, either), so you only ever receive the first buffer of data. If the image is small, that works; if the image is bigger, it never will.
You need to write the code so that it keeps sending while there is buffer space until all the data is sent, and keeps reading data (into an NSMutableData instance variable, not a local variable) until the end of the stream is reached. A sketch of both sides follows.
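Here is a minimal sketch of that structure, assuming hypothetical receivedData (NSMutableData), outgoingData (NSData), and sendOffset (NSUInteger) properties; none of these names are in WiTap itself:

// Receiver: accumulate into an ivar and only consume the data at end-of-stream.
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            uint8_t buffer[32768];
            NSInteger bytesRead = [self.inputStream read:buffer maxLength:sizeof(buffer)];
            if (bytesRead > 0) {
                [self.receivedData appendBytes:buffer length:(NSUInteger)bytesRead];
            }
        } break;
        case NSStreamEventEndEncountered: {
            // The sender closed the stream, so the transfer is now complete.
            [self getData:self.receivedData];
        } break;
        case NSStreamEventHasSpaceAvailable: {
            [self sendMoreIfPossible];   // sender side, see below
        } break;
        default:
            break;
    }
}

// Sender: write as much as fits now, and resume on the next space-available event.
- (void)sendMoreIfPossible
{
    while (self.sendOffset < [self.outgoingData length] && [self.outputStream hasSpaceAvailable]) {
        const uint8_t *bytes = [self.outgoingData bytes];
        NSInteger written = [self.outputStream write:bytes + self.sendOffset
                                           maxLength:[self.outgoingData length] - self.sendOffset];
        if (written <= 0) break;   // error; the stream delegate will hear about it
        self.sendOffset += written;
    }
}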
You can use AsyncSocket, which can be downloaded from
https://github.com/roustem/AsyncSocket
This is an Objective-C wrapper built on CFSocket and CFNetwork; it can handle large data transfers over TCP/UDP on a local Wi-Fi network.
You can find the wiki here: https://github.com/darkseed/cocoaasyncsocket/wiki/iPhone
The class is very simple and easy to implement. Give it a try; a minimal sketch follows.
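For instance, sending with the GCD flavour of the CocoaAsyncSocket library might look roughly like this (the host, port, and tag are placeholders; in real code keep a strong reference to the socket in a property and implement the GCDAsyncSocketDelegate methods):

#import "GCDAsyncSocket.h"

- (void)sendFileData:(NSData *)fileData
{
    GCDAsyncSocket *socket = [[GCDAsyncSocket alloc] initWithDelegate:self
                                                        delegateQueue:dispatch_get_main_queue()];
    NSError *error = nil;
    if (![socket connectToHost:@"192.168.1.10" onPort:12345 error:&error]) {
        NSLog(@"connect failed: %@", error);
        return;
    }
    // TCP gives reliable, ordered delivery; the library queues the whole payload.
    [socket writeData:fileData withTimeout:-1 tag:0];
}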
Alternatively, you can make a web service: enter the IP address of the system you want to send the file to, and once you are connected to that IP address, send the file as Base64-encoded NSData.
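Encoding the payload is one line with the NSData Base64 API (available since iOS 7; older systems need a third-party category), sketched here with a placeholder method name:

- (NSString *)base64StringForFileAtPath:(NSString *)filePath
{
    NSData *fileData = [NSData dataWithContentsOfFile:filePath];
    // POST the resulting string to the service at the entered IP address.
    return [fileData base64EncodedStringWithOptions:0];
}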
Related
I am working on an app for users to upload videos to our FTP server.
So far almost everything is done, but I have one issue: after users upload videos (.MOV), I can't open and play the files.
The error message QuickTime Player returns is "can't open because the movie's file format is not recognized".
In my code, I let users select videos using ALAssetsLibrary.
Then I load the video into an ALAsset object and, before uploading, load it from the ALAsset into an NSInputStream. Here is the code:
ALAssetRepresentation *rep = [currentAsset defaultRepresentation];
Byte *buffer = (Byte *)malloc(rep.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
iStream = [NSInputStream inputStreamWithData:data];
[iStream open];
The next step is to set up an NSOutputStream, open it, and handle the upload with the following code:
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode) {
        case NSStreamEventNone: {
            break;
        }
        case NSStreamEventOpenCompleted: {
            // opened connection
            NSLog(@"opened connection");
            break;
        }
        case NSStreamEventHasBytesAvailable: {
            // should never happen for the output stream
            [self stopSendWithStatus:@"should never happen for the output stream"];
            break;
        }
        case NSStreamEventHasSpaceAvailable: {
            // If we don't have any data buffered, go read the next chunk of data.
            NSInteger bufferSize = 65535;
            uint8_t *buffer = malloc(bufferSize);
            if (bufferOffset == bufferLimit) {
                NSInteger bytesRead = [iStream read:buffer maxLength:bufferSize];
                if (bytesRead == -1) {
                    [self stopSendWithStatus:@"file read error"];
                } else if (bytesRead == 0) {
                    [self stopSendWithStatus:nil];
                } else {
                    bufferOffset = 0;
                    bufferLimit = bytesRead;
                }
            }
            // If we're not out of data completely, send the next chunk.
            if (bufferOffset != bufferLimit) {
                NSInteger bytesWritten = [oStream write:&buffer[bufferOffset] maxLength:bufferLimit - bufferOffset];
                if (bytesWritten == -1) {
                    [self stopSendWithStatus:@"file write error"];
                } else {
                    bufferOffset += bytesWritten;
                }
            }
            //NSLog(@"available");
            break;
        }
        case NSStreamEventErrorOccurred: {
            // stream open error
            [self stopSendWithStatus:[[aStream streamError] description]];
            break;
        }
        case NSStreamEventEndEncountered: // ignore
            NSLog(@"end");
            break;
    }
}
No error occurs, and the video file does upload to the FTP server with the correct size and name; it just can't be opened.
Does anybody have any clue?
I have made an NSInputStream implementation for streaming ALAsset objects: POSInputStreamLibrary. It doesn't read the whole 1 GB video into memory as your solution does, but reads the movie in chunks instead. Of course, this is not the only feature of POSBlobInputStream. More info at my GitHub repository.
I know this probably isn't the answer you're looking for, but you should NOT let users upload files to your web server over a direct FTP connection. It's insecure and slow compared with REST.
Instead, why not write a tiny bit of PHP to handle the upload, and POST the file from the app via REST? Here:
$uploaddir = 'uploads/';
$file = basename($_FILES['file']['name']);
$uploadfile = $uploaddir . $file;
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
I also recommend using AFNetworking to handle the POST request (http://afnetworking.com/); a sketch follows.
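A rough sketch with AFNetworking 1.x (the base URL, path, and file name are placeholders; the part name must match the key your PHP reads from $_FILES):

#import "AFHTTPClient.h"
#import "AFHTTPRequestOperation.h"

- (void)uploadVideoData:(NSData *)videoData
{
    AFHTTPClient *client = [AFHTTPClient clientWithBaseURL:[NSURL URLWithString:@"http://example.com"]];
    NSMutableURLRequest *request =
        [client multipartFormRequestWithMethod:@"POST"
                                          path:@"/upload.php"
                                    parameters:nil
                     constructingBodyWithBlock:^(id <AFMultipartFormData> formData) {
                         [formData appendPartWithFileData:videoData
                                                     name:@"file"
                                                 fileName:@"video.mov"
                                                 mimeType:@"video/quicktime"];
                     }];
    AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
    [client enqueueHTTPRequestOperation:operation];
}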
First of all, I guess you meant to reduce memory use by converting the ALAsset to an NSInputStream rather than to NSData. But you convert it to NSData first and then turn that NSData into an NSInputStream, which doesn't make sense and won't reduce memory use, because you have already loaded the whole video into memory as NSData.
So if you want to transfer your video via a stream in order to reduce memory pressure (or you have no choice because your video is 2 GB or more), you should use CFStreamCreateBoundPair to upload the file chunk by chunk. See the Apple iOS Developer Library:
For large blocks of constructed data, call CFStreamCreateBoundPair to create a pair of streams, then call the setHTTPBodyStream: method to tell NSMutableURLRequest to use one of those streams as the source for its body content. By writing into the other stream, you can send the data a piece at a time.
I have a Swift version of converting an ALAsset to an NSInputStream via CFStreamCreateBoundPair on GitHub. The key point is just as the documentation describes; another reference is this question.
Hope it helps; a sketch of the bound-pair setup is below.
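In outline, and assuming a placeholder upload URL, the bound pair is wired up like this (the casts are toll-free bridging, matching the style used elsewhere on this page):

- (void)startChunkedUpload
{
    CFReadStreamRef readStream = NULL;
    CFWriteStreamRef writeStream = NULL;
    CFStreamCreateBoundPair(kCFAllocatorDefault, &readStream, &writeStream, 64 * 1024);

    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]];
    [request setHTTPMethod:@"POST"];
    // The request consumes whatever you write into the other half of the pair.
    [request setHTTPBodyStream:(NSInputStream *)readStream];

    NSOutputStream *producerStream = (NSOutputStream *)writeStream;
    [producerStream setDelegate:self];
    [producerStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [producerStream open];
    [NSURLConnection connectionWithRequest:request delegate:self];
    // In the NSStreamEventHasSpaceAvailable callback, copy the next chunk out of the
    // ALAssetRepresentation with getBytes:fromOffset:length:error: and write it.
}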
I found a memory leak when testing my app on an iOS device. Look at the code below:
- (void)_startReceive
// Starts a connection to download the current URL.
{
    // Open a CFFTPStream for the URL.
    CFReadStreamRef ftpStream = CFReadStreamCreateWithFTPURL(NULL, (CFURLRef)url);
    assert(ftpStream != NULL);
    self.networkStream = (NSInputStream *)ftpStream;
    self.networkStream.delegate = self;
    [self.networkStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:RUNLOOPMODEL];
    [self.networkStream open];
    CFRelease(ftpStream);
}
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
// An NSStream delegate callback that's called when events happen on our
// network stream.
{
    if (self.networkStream == nil) { // EXC_BAD_ACCESS (code = 1, address = ......)
        NSLog(@"here");
    }
    switch (eventCode) {
        case NSStreamEventOpenCompleted: {
        } break;
        case NSStreamEventHasBytesAvailable: {
            NSInteger bytesRead;
            uint8_t buffer[LISTDOCBUFFER];
            ......
        }
I use this code to make an FTP request for document information, but only sometimes (about one in eight runs) does the problem occur at the line I noted. When testing on the iOS Simulator, it never happens. I want to know the possible reason and how to fix it.
The reason could be almost anything, but the most likely cause is invalid memory management. You can analyze your project in Xcode (choose Analyze from the Product menu) to see where the memory issue is actually happening, or run Profile from the same menu to detect specific leaks. Check out this link; it is a really good topic on how to debug memory-related issues.
After type-casting ftpStream to NSInputStream, you release it (CFRelease(ftpStream)) and then use it again in if (self.networkStream == nil). Don't call CFRelease() on ftpStream there; release the NSInputStream once you are done with it instead.
I currently have a method that sends data through Bonjour. The problem is that my data is limited to about 1000 bytes, and I know that if I want to send larger data I need to break it into packets.
But this raises the question of how to prevent packets from being lost, and how to ensure all packets are received by the receiver.
I am not good with networking, so I would like to ask you to help me change this simple method to enable larger data transfers.
- (BOOL)sendData:(NSData *)data error:(NSError **)error {
    BOOL successful = NO;
    if (self.outputStreamHasSpace) {
        NSInteger len = [self.outputStream write:[data bytes] maxLength:[data length]];
        if (-1 == len) {
            // error occurred
            *error = [[NSError alloc] initWithDomain:ServerErrorDomain
                                                code:kServerNoSpaceOnOutputStream
                                            userInfo:[[self.outputStream streamError] userInfo]];
        } else if (0 == len) {
            // stream has reached capacity
            *error = [[NSError alloc] initWithDomain:ServerErrorDomain
                                                code:kServerOutputStreamReachedCapacity
                                            userInfo:[[self.outputStream streamError] userInfo]];
        } else {
            successful = YES;
        }
    } else {
        *error = [[NSError alloc] initWithDomain:ServerErrorDomain
                                            code:kServerNoSpaceOnOutputStream
                                        userInfo:nil];
    }
    return successful;
}
Thank you.
You're not sending them over 'Bonjour'; you're sending UDP packets to a multicast address. On most networks the maximum frame size is 1500 bytes. Realistically, allowing for headers, VLAN tags, etc., you have about 1.3 to 1.4 kB of data per frame to fill. As the data is going over UDP, controlling the correct reception and ordering of packets is up to you; that's one of the drawbacks of not using TCP ;) One way to frame the chunks is sketched below.
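For example, a hypothetical framing scheme that prepends a sequence number and the total chunk count to each datagram, so the receiver can detect loss and restore order (none of these names come from the question's code):

#import <arpa/inet.h>   // htonl/ntohl

typedef struct {
    uint32_t sequenceNumber;   // network byte order
    uint32_t totalChunks;
} ChunkHeader;

- (NSArray *)chunksForData:(NSData *)data
{
    const NSUInteger payloadSize = 1300;   // stay under the ~1500-byte frame size
    NSUInteger total = ([data length] + payloadSize - 1) / payloadSize;
    NSMutableArray *chunks = [NSMutableArray arrayWithCapacity:total];
    for (NSUInteger i = 0; i < total; i++) {
        ChunkHeader header;
        header.sequenceNumber = htonl((uint32_t)i);
        header.totalChunks = htonl((uint32_t)total);
        NSMutableData *chunk = [NSMutableData dataWithBytes:&header length:sizeof(header)];
        NSUInteger offset = i * payloadSize;
        NSUInteger len = MIN(payloadSize, [data length] - offset);
        [chunk appendData:[data subdataWithRange:NSMakeRange(offset, len)]];
        [chunks addObject:chunk];
    }
    return chunks;
}
// The receiver reads the header back with ntohl(), reorders the payloads, and
// re-requests (or simply fails) when a sequence number never arrives.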
I'm trying to send files over a Bluetooth connection.
I got this to work thanks to my previous post, but the method wasn't memory efficient: the whole file was loaded into memory before it was sent, which created problems (and crashed the app) for files larger than about 20 MB. So I've come up with a new method that reads only the part of the file needed at a specific moment, creates a packet from that data, sends it, and repeats the process for each 8 kB chunk of the file.
So I made a class method that generates a packet, informs the controller that the packet is available (through a protocol), and repeats the process for each packet.
Here's the code for the packet generator:
+ (void)makePacketsFromFile:(NSString *)path withDelegate:(id <filePacketDelegate>)aDelegate {
    if (![[NSFileManager defaultManager] fileExistsAtPath:path] || aDelegate == nil) return;
    id <filePacketDelegate> delegate = aDelegate;
    const NSUInteger quanta = 8192;
    uint filesize;
    NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:path error:nil];
    filesize = [[fileAttributes objectForKey:NSFileSize] intValue];
    int numOfPackets = (int)ceil(filesize / quanta);
    if (numOfPackets == 0) numOfPackets = 1;
    NSLog(@"filesize = %d, numOfPackets = %d or %.3f", filesize, numOfPackets, (float)ceil(filesize / quanta));
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
    int offset = 0;
    int counter = 0;
    while (counter != (numOfPackets + 1)) {
        uint len = (filesize < quanta) ? filesize : quanta;
        if (counter == numOfPackets) {
            len = filesize - offset;
        }
        [handle seekToFileOffset:offset];
        NSData *fileData = [handle readDataOfLength:len];
        file_packet *packet = [[file_packet alloc] initWithFileName:[path lastPathComponent] ofType:0 index:counter];
        packet.packetContents = fileData;
        [fileData release];
        packet.checksum = @"<to be done>";
        packet.numberOfPackets = [NSString stringWithFormat:@"%d", numOfPackets];
        [delegate packetIsReadyForSending:packet];
        [packet release];
        offset += quanta;
        counter++;
    }
    [handle closeFile];
}
And receiving and sending the file:
- (void)packetIsReadyForSending:(file_packet *)packet {
    NSData *fileData = [packet dataForSending];
    [self.connectionSession sendDataToAllPeers:fileData withDataMode:GKSendDataReliable error:nil];
}

- (void)sendFileViaBluetooth {
    [file_packet makePacketsFromFile:selectedFilePath withDelegate:self];
}
However, the memory use is quite large, which is not what I expected.
I'm a bit stuck on this, as I wouldn't like to restrict Bluetooth sharing to files smaller than 20 MB.
Any help appreciated.
Edit:
I've been thinking about this for a while and have come to the conclusion that it's not my code causing the memory allocation issue; it's GameKit's stack for sending the packets.
I think I'm generating too many packets too fast, and GameKit isn't sending them quickly enough.
So I am now thinking about a way to tell when GameKit has sent a packet, and to generate the next one only once GameKit confirms the previous one was sent.
There's one clear memory management issue in your code:
NSData *fileData = [handle readDataOfLength:len];
fileData wasn't obtained via NARC (a method whose name contains new, alloc, retain, or copy), so you don't own it, and hence you must not release it:
[fileData release];
Oops.
As for the memory use, one thing to consider is that even though you release packet you can’t really tell what’s going on inside -[GKSession sendDataToAllPeers:withDataMode:error:]. It is possible that the method internally creates autoreleased objects that only end up being released after +makePacketsFromFile:withDelegate: has finished executing. I suggest you use a new autorelease pool in every loop iteration:
while (counter != (numOfPackets + 1)) {
    NSAutoreleasePool *pool = [NSAutoreleasePool new];
    uint len = (filesize < quanta) ? filesize : quanta;
    …
    counter++;
    [pool drain];
}
By doing this, any autoreleased object inside that loop will be effectively released at the end of each loop iteration.
Check out NSInputStream. With it you can open a stream and read out only a buffer of bytes at a time.
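A minimal sketch of reading a file in 8 kB chunks that way (the method name is a placeholder; the delegate hand-off that +makePacketsFromFile:withDelegate: performs is only indicated in a comment):

- (void)streamPacketsFromFile:(NSString *)path
{
    NSInputStream *stream = [NSInputStream inputStreamWithFileAtPath:path];
    [stream open];
    uint8_t chunk[8192];
    while ([stream hasBytesAvailable]) {
        NSInteger n = [stream read:chunk maxLength:sizeof(chunk)];
        if (n <= 0) break;   // 0 = end of file, -1 = error
        NSData *chunkData = [NSData dataWithBytes:chunk length:(NSUInteger)n];
        // Build a file_packet from chunkData and hand it to the delegate here.
    }
    [stream close];
}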
I was wondering how to access an MPMediaItem's raw data.
Any ideas?
You can obtain the media item's data in the following way:
- (void)mediaItemToData
{
    // Implement the media item picker in your project
    MPMediaItem *curItem = musicPlayer.nowPlayingItem;
    NSURL *url = [curItem valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:songAsset
                                                                      presetName:AVAssetExportPresetPassthrough];
    exporter.outputFileType = @"public.mpeg-4";
    NSString *exportFile = [[self myDocumentsDirectory] stringByAppendingPathComponent:@"exported.mp4"];
    NSURL *exportURL = [[NSURL fileURLWithPath:exportFile] retain];
    exporter.outputURL = exportURL;
    // do the export (error handling omitted)
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSData *data = [NSData dataWithContentsOfFile:[[self myDocumentsDirectory]
                            stringByAppendingPathComponent:@"exported.mp4"]];
        // Do something with the data
    }];
}
This code will work only on iOS 4.0 and later.
Good luck!
Of course you can access the data of an MPMediaItem. It's not crystal clear at once, but it works. Here's how:
1. Get the media item's URL from its MPMediaItemPropertyAssetURL property
2. Initialize an AVURLAsset with this URL
3. Initialize an AVAssetReader with this asset
4. Fetch the AVAssetTrack you want to read from the AVURLAsset
5. Create an AVAssetReaderTrackOutput with this track
6. Add this output to the AVAssetReader created before and -startReading
7. Fetch all data with AVAssetReaderTrackOutput's -copyNextSampleBuffer
8. PROFIT!
Here is some sample code from a project of mine (it's not a code jewel; I wrote it some time back in my coding dark ages):
typedef enum {
    kEDSupportedMediaTypeAAC = 'aac ',
    kEDSupportedMediaTypeMP3 = '.mp3'
} EDSupportedMediaType;

- (EDLibraryAssetReaderStatus)prepareAsset {
    // Get the AVURLAsset
    AVURLAsset *uasset = [m_asset URLAsset];
    // Check for DRM protected content
    if (uasset.hasProtectedContent) {
        return kEDLibraryAssetReader_TrackIsDRMProtected;
    }
    if ([[uasset tracks] count] == 0) {
        DDLogError(@"no asset tracks found");
        return AVAssetReaderStatusFailed;
    }
    // Initialize a reader with a track output
    NSError *err = nil;
    m_reader = [[AVAssetReader alloc] initWithAsset:uasset error:&err];
    if (!m_reader || err) {
        DDLogError(@"could not create asset reader (%ld)\n", (long)[err code]);
        return AVAssetReaderStatusFailed;
    }
    // Check tracks for a valid format. Currently we only support the MP3 and AAC types; WAV and AIFF are too large to handle
    for (AVAssetTrack *track in uasset.tracks) {
        NSArray *formats = track.formatDescriptions;
        for (int i = 0; i < [formats count]; i++) {
            CMFormatDescriptionRef format = (CMFormatDescriptionRef)[formats objectAtIndex:i];
            // Check the format types
            CMMediaType mediaType = CMFormatDescriptionGetMediaType(format);
            FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format);
            DDLogVerbose(@"mediaType: %s, mediaSubType: %s", COFcc(mediaType), COFcc(mediaSubType));
            if (mediaType == kCMMediaType_Audio) {
                if (mediaSubType == kEDSupportedMediaTypeAAC ||
                    mediaSubType == kEDSupportedMediaTypeMP3) {
                    m_track = [track retain];
                    m_format = CFRetain(format);
                    break;
                }
            }
        }
        if (m_track != nil && m_format != NULL) {
            break;
        }
    }
    if (m_track == nil || m_format == NULL) {
        return kEDLibraryAssetReader_UnsupportedFormat;
    }
    // Create an output for the found track
    m_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:m_track outputSettings:nil];
    [m_reader addOutput:m_output];
    // Start reading
    if (![m_reader startReading]) {
        DDLogError(@"could not start reading asset");
        return kEDLibraryAssetReader_CouldNotStartReading;
    }
    return 0;
}
- (OSStatus)copyNextSampleBufferRepresentation:(CMSampleBufferRepresentationRef *)repOut {
    pthread_mutex_lock(&m_mtx);
    OSStatus err = noErr;
    AVAssetReaderStatus status = m_reader.status;
    if (m_invalid) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_Invalidated;
    }
    else if (status != AVAssetReaderStatusReading) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }
    // Read the next sample buffer
    CMSampleBufferRef sbuf = [m_output copyNextSampleBuffer];
    if (sbuf == NULL) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }
    CMSampleBufferRepresentationRef srep = CMSampleBufferRepresentationCreateWithSampleBuffer(sbuf);
    if (srep && repOut != NULL) {
        *repOut = srep;
    }
    else {
        DDLogError(@"CMSampleBufferRef corrupted");
        EDCFShow(sbuf);
        err = kEDLibraryAssetReader_BufferCorrupted;
    }
    CFRelease(sbuf);
    pthread_mutex_unlock(&m_mtx);
    return err;
}
You can't, and there is no workaround. An MPMediaItem is not the actual piece of media; it is just the metadata about the media item, communicated to the application via RPC from another process. The data for the item itself is not accessible in your address space.
I should note that even if you have the MPMediaItem, its data probably is not loaded into the device's memory. The flash on the iPhone is slow and memory is scarce. While Apple may not want you to have access to the raw data backing an MPMediaItem, it is just as likely that they didn't bother because they didn't want to invest the time necessary to deal with the APIs. If they did provide access to such a thing, it almost certainly would not be as an NSData, but more likely as an NSURL given to your application that would allow it to open the file and stream through the data.
In any event, if you want the functionality, you should file a bug report asking for it.
Also, as a side note, don't mention your age in a bug report you send to Apple. I think it is very cool that you are writing apps for the phone; when I was your age I loved experimenting with computers (back then I was working on things written in Lisp). The thing is, you cannot legally agree to a contract in the United States, which is why the developer agreement specifically prohibits you from joining. From the first paragraph of the agreement:
You also certify that you are of the legal age of majority in the jurisdiction in which you reside (at least 18 years of age in many countries) and you represent that you are legally permitted to become a Registered iPhone Developer.
If you mention to a WWDR representative that you are not of age of majority they may realize you are in violation of the agreement and be obligated to terminate your developer account. Just a friendly warning.