There is no sound with iFrameExtractor player - ios5

I have built ffmpeg and iFrameExtractor with iOS 5.1 successfully, but when I play the video there is no sound.
// Register all formats and codecs
avcodec_register_all();
av_register_all();
avformat_network_init();
if (avformat_open_input(&pFormatCtx, [@"http://somesite.com/test.mp4" cStringUsingEncoding:NSASCIIStringEncoding], NULL, NULL) != 0) {
    av_log(NULL, AV_LOG_ERROR, "Couldn't open file\n");
    goto initError;
}
The log is:
[swscaler @ 0xdd3000] No accelerated colorspace conversion found from yuv420p to rgb24.
2012-10-22 20:42:47.344 iFrameExtractor[356:707] video duration: 5102.840000
2012-10-22 20:42:47.412 iFrameExtractor[356:707] video size: 720 x 576
2012-10-22 20:42:47.454 iFrameExtractor[356:707] Application windows are expected to have a root view
This is my configure script for ffmpeg 0.11.1:
#!/bin/tcsh -f
rm -rf compiled/*
./configure \
--cc=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc \
--as='/usr/local/bin/gas-preprocessor.pl /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' \
--sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk \
--target-os=darwin \
--arch=arm \
--cpu=cortex-a8 \
--extra-cflags='-arch armv7' \
--extra-ldflags='-arch armv7 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk' \
--prefix=compiled/armv7 \
--enable-cross-compile \
--enable-nonfree \
--disable-armv5te \
--disable-swscale-alpha \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--enable-decoder=h264 \
--enable-decoder=svq3 \
--disable-asm \
--disable-bzlib \
--disable-gpl \
--disable-shared \
--enable-static \
--disable-mmx \
--disable-neon \
--disable-decoders \
--disable-muxers \
--disable-demuxers \
--disable-devices \
--disable-parsers \
--disable-encoders \
--enable-protocols \
--disable-filters \
--disable-bsfs \
--disable-postproc \
--disable-debug
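One thing stands out in this configure: `--disable-decoders` turns every decoder off, and only h264 and svq3 (both video codecs) are re-enabled, so no audio decoder (e.g. AAC, the usual audio codec in an .mp4) is compiled in at all. That alone would produce silent playback. A sketch of how the flags resolve (the rule that the blanket disable is applied before the per-decoder enables is an assumption of this sketch, not verified against 0.11.1's configure):

```python
# Model the decoder-related flags from the configure invocation above.
# --disable-decoders clears the whole set; each --enable-decoder=NAME
# re-enables just that one decoder.
flags = [
    "--enable-decoder=h264",
    "--enable-decoder=svq3",
    "--disable-decoders",
]

enabled = {f.split("=", 1)[1] for f in flags if f.startswith("--enable-decoder=")}

# Common audio codecs found in mp4 files -- none of them is enabled,
# so decoded audio can never be produced, regardless of playback code.
audio_decoders = {"aac", "mp3", "ac3"}
print(sorted(enabled), enabled & audio_decoders)  # ['h264', 'svq3'] set()
```

If this is the cause, adding the missing audio decoder (e.g. an `--enable-decoder=aac` line) and rebuilding would be the first thing to try.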

There is not enough information here.
What URL are you trying to open, for instance?
Were there messages in the log? I know that with version 0.11 you get a few warnings if you don't call avformat_network_init, but that wouldn't stop it from working.
Some things that worked in previous versions have changed; e.g. you used to be able to append ?tcp to tell ffmpeg to use TCP, but now it must be done through the options dictionary.
Please provide both the syslog and build logs if possible.
Here's an example from one of our apps
avcodec_register_all();
avdevice_register_all();
av_register_all();
avformat_network_init();

const char *filename = [url UTF8String];
NSLog(@"filename = %@", url);
// err = av_open_input_file(&avfContext, filename, NULL, 0, NULL);
AVDictionary *opts = 0;
if (usesTcp) {
    av_dict_set(&opts, "rtsp_transport", "tcp", 0);
}
err = avformat_open_input(&avfContext, filename, NULL, &opts);
av_dict_free(&opts);
if (err) {
    NSLog(@"Error: Could not open stream: %d", err);
    return nil;
}
else {
    NSLog(@"Opened stream");
}

So, assuming you do have a block of code like the following: what do you do with the audio? You have to use one of the audio APIs to process it; Audio Queues would probably be the easiest if you're dealing mostly with known types.
First, in your initialization, get the audio info from the stream:
// Retrieve stream information
if (av_find_stream_info(pFormatCtx) < 0)
    return; // Couldn't find stream information

// Find the video and audio streams
videoStream = -1;
audioStream = -1;
for (int i = 0; i < pFormatCtx->nb_streams; i++) {
    if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
        videoStream = i;
    }
    if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO) {
        audioStream = i;
        NSLog(@"found audio stream");
    }
}
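The same scan can be sketched in plain Python over mock stream descriptors to make the selection logic explicit (the constants and stream layout below are illustrative stand-ins, not libavformat's real values):

```python
# Illustrative stand-ins for libavformat's media-type constants.
AVMEDIA_TYPE_VIDEO, AVMEDIA_TYPE_AUDIO = 0, 1

# Hypothetical layout of a typical mp4: one video stream, one audio stream.
streams = [
    {"codec_type": AVMEDIA_TYPE_VIDEO},
    {"codec_type": AVMEDIA_TYPE_AUDIO},
]

video_stream = audio_stream = -1
for i, s in enumerate(streams):
    if s["codec_type"] == AVMEDIA_TYPE_VIDEO:
        video_stream = i
    if s["codec_type"] == AVMEDIA_TYPE_AUDIO:
        audio_stream = i

# If audio_stream is still -1 afterwards, the file either has no audio
# stream or the matching decoder/demuxer was not compiled in.
print(video_stream, audio_stream)  # 0 1
```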
Then, later in your processing loop, do something like this:
while (!frameFinished && av_read_frame(pFormatCtx, &packet) >= 0) {
    // Is this a packet from the video stream?
    if (packet.stream_index == videoStream) {
        // Decode the video frame and do something with it.
    }
    if (packet.stream_index == audioStream) {
        // NSLog(@"audio stream");
        // Do something with the audio packet; here we simply add it to a
        // processing queue to be handled by another thread.
        [audioPacketQueueLock lock];
        audioPacketQueueSize += packet.size;
        [audioPacketQueue addObject:[NSMutableData dataWithBytes:&packet length:sizeof(packet)]];
        [audioPacketQueueLock unlock];
    }
}
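The locked-queue pattern above (demux thread appends, audio thread drains) can be sketched language-neutrally in Python; the class and names below are hypothetical, mirroring audioPacketQueue and audioPacketQueueSize:

```python
import threading
from collections import deque

class AudioPacketQueue:
    """Thread-safe packet queue, analogous to the locked NSMutableArray."""

    def __init__(self):
        self._lock = threading.Lock()
        self._packets = deque()
        self.size = 0  # total queued bytes, like audioPacketQueueSize

    def put(self, packet: bytes):
        with self._lock:
            self._packets.append(packet)
            self.size += len(packet)

    def get(self):
        """Pop the oldest packet, or None if the queue is empty."""
        with self._lock:
            if not self._packets:
                return None
            packet = self._packets.popleft()
            self.size -= len(packet)
            return packet

q = AudioPacketQueue()
q.put(b"\x00" * 512)  # the demux thread would enqueue real AVPacket data
print(q.size)  # 512
```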
To play the audio, you can look at this for some examples:
https://github.com/mooncatventures-group/FFPlayer-beta1/blob/master/FFAVFrames-test/AudioController.m

protoc-gen-grpc-swift: Plugin killed by signal 9

I run the following command:
protoc ./Sources/Protos/echo.proto \
--proto_path=./Sources/Protos/ \
--plugin=/usr/local/bin/protoc-gen-swift \
--swift_opt=Visibility=Public \
--swift_out=Sources/Protos/ \
--plugin=/usr/local/bin/protoc-gen-grpc-swift \
--grpc-swift_opt=Visibility=Public,AsyncClient=True,AsyncServer=True \
--grpc-swift_out=./Sources/Protos/
The protocol is:
syntax = "proto3";

package echo;

service EchoService {
  rpc echo(EchoRequest) returns (EchoResponse);
}

message EchoRequest {
  string contents = 1;
}

message EchoResponse {
  string contents = 1;
}
And receive the error:
--grpc-swift_out: protoc-gen-grpc-swift: Plugin killed by signal 9.
I am running tag 1.7.1-async-await.2.
This was generating the output files under 1.8.1, but I would like the async/await version. It fails even if the async flags are False. I have verified that the plugins in /usr/local/bin are the ones generated by make plugins on tag 1.7.1-async-await.2.
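Signal 9 is SIGKILL: the plugin process was killed from outside rather than crashing on its own. Common culprits (guesses without more logs) are the OOM killer or, on Apple Silicon Macs, the kernel refusing a binary built for the wrong architecture or with an invalid signature. A quick sketch of how a SIGKILL'd child is reported, which is what protoc relays as "Plugin killed by signal 9":

```python
import signal
import subprocess
import sys

# Hypothetical stand-in for the plugin process: a sleeping child that we
# kill ourselves with SIGKILL (signal 9).
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(30)"])
proc.send_signal(signal.SIGKILL)
proc.wait()

# A negative return code means the child was terminated by that signal.
print(proc.returncode)  # -9
```

Checking `dmesg`/Console for OOM or code-signing messages at the time of the failure would distinguish the two causes.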

Spark stream stops abruptly - "the specified path does not exist"

I am working with Spark Structured Streaming. My stream works fine, but after some time it just stops because of the issue below.
Any suggestions as to what the reason could be and how to resolve this issue?
java.io.FileNotFoundException: Operation failed: "The specified path does not exist.", 404, GET, https://XXXXXXXX.dfs.core.windows.net/output?upn=false&resource=filesystem&maxResults=5000&directory=XXXXXXXX&timeout=90&recursive=true, PathNotFound, "The specified path does not exist. RequestId:d1b7c77f-e01f-0027-7f09-4646f7000000 Time:2022-04-01T20:47:30.1791444Z"
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:1290)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.listKeysWithPrefix(AzureBlobFileSystem.java:530)
at com.databricks.tahoe.store.EnhancedAzureBlobFileSystemUpgrade.listKeysWithPrefix(EnhancedFileSystem.scala:605)
at com.databricks.tahoe.store.EnhancedDatabricksFileSystemV2.$anonfun$listKeysWithPrefix$1(EnhancedFileSystem.scala:374)
at com.databricks.backend.daemon.data.client.DBFSV2.$anonfun$listKeysWithPrefix$1(DatabricksFileSystemV2.scala:247)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:395)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:484)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:504)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:266)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:261)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:258)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext(DatabricksFileSystemV2.scala:510)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:305)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:297)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags(DatabricksFileSystemV2.scala:510)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:479)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:404)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperationWithResultTags(DatabricksFileSystemV2.scala:510)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:395)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:367)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperation(DatabricksFileSystemV2.scala:510)
at com.databricks.backend.daemon.data.client.DBFSV2.listKeysWithPrefix(DatabricksFileSystemV2.scala:240)
at com.databricks.tahoe.store.EnhancedDatabricksFileSystemV2.listKeysWithPrefix(EnhancedFileSystem.scala:374)
at com.databricks.tahoe.store.AzureLogStore.listKeysWithPrefix(AzureLogStore.scala:54)
at com.databricks.tahoe.store.DelegatingLogStore.listKeysWithPrefix(DelegatingLogStore.scala:251)
at com.databricks.sql.fileNotification.autoIngest.FileEventBackfiller$.listFiles(FileEventWorkerThread.scala:967)
at com.databricks.sql.fileNotification.autoIngest.FileEventBackfiller.runInternal(FileEventWorkerThread.scala:876)
at com.databricks.sql.fileNotification.autoIngest.FileEventBackfiller.run(FileEventWorkerThread.scala:809)
Caused by: Operation failed: "The specified path does not exist.", 404, GET, https://XXXXXXXXXX.dfs.core.windows.net/output?upn=false&resource=filesystem&maxResults=5000&directory=XXXXXXXX&timeout=90&recursive=true, PathNotFound, "The specified path does not exist. RequestId:02ae07cf-901f-0001-080e-46dd43000000 Time:2022-04-01T21:21:40.2136657Z"
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:241)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.listPath(AbfsClient.java:235)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.listFiles(AzureBlobFileSystemStore.java:1112)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.access$200(AzureBlobFileSystemStore.java:143)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore$1.fetchMoreResults(AzureBlobFileSystemStore.java:1052)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore$1.(AzureBlobFileSystemStore.java:1033)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.listKeysWithPrefix(AzureBlobFileSystemStore.java:1029)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.listKeysWithPrefix(AzureBlobFileSystem.java:527)
... 27 more
Below is my code:
from pyspark.sql.functions import *
from pyspark.sql.types import *
from pyspark.sql.types import StructType, StringType
from pyspark.sql import functions as F
from delta.tables import *

spark.sql("set spark.sql.files.ignoreMissingFiles=true")

filteredRawDF = None
try:
    filteredRawDF = spark.readStream.format("cloudFiles") \
        .option("cloudFiles.format", "json") \
        .option("cloudFiles.schemaLocation", landingcheckPointFilePath) \
        .option("cloudFiles.inferColumnTypes", "true") \
        .load(landingFilePath) \
        .select(from_json('body', schema).alias('temp')) \
        .select(explode("temp.report.data").alias("details")) \
        .select("details",
                explode("details.breakdown").alias("inner_breakdown")) \
        .select("details", "inner_breakdown",
                explode("inner_breakdown.breakdown").alias("outer_breakdown")) \
        .select(to_timestamp(col("details.name"), "yyyy-MM-dd'T'HH:mm:ss+SSSS").alias('datetime'),
                col("details.year"),
                col("details.day"),
                col("details.hour"),
                col("details.minute"),
                col("inner_breakdown.name").alias("hotelName"),
                col("outer_breakdown.name").alias("checkindate"),
                col("outer_breakdown.counts")[0].cast("int").alias("HdpHits"))
except Exception as e:
    print(e)

query = filteredRawDF \
    .writeStream \
    .format("delta") \
    .option("mergeSchema", "true") \
    .outputMode("append") \
    .option("checkpointLocation", checkPointPath) \
    .trigger(processingTime='50 seconds') \
    .start(savePath)
Thanks
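One common mitigation for a transient "path does not exist" listing failure like this is to restart the stream from its checkpoint when that specific error surfaces, since the checkpoint makes restarts idempotent. A minimal sketch, with start_stream as a hypothetical stand-in for submitting the streaming job (this is not a Databricks API):

```python
import time

def run_with_restart(start_stream, max_restarts=3, delay_s=0):
    """Re-run start_stream when it dies with the known transient 404."""
    attempts = 0
    while True:
        try:
            return start_stream()
        except IOError as e:
            # Only retry the specific failure seen in the stack trace above.
            if "The specified path does not exist" not in str(e):
                raise
            attempts += 1
            if attempts > max_restarts:
                raise
            time.sleep(delay_s)

# Demo with a fake job that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError('Operation failed: "The specified path does not exist."')
    return "stream finished"

print(run_with_restart(flaky))  # stream finished
```

This treats the symptom; if files are being deleted from the monitored directory while Auto Loader's backfill is listing it, fixing the lifecycle of that directory is the real cure.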

Merging Audio file with Video file in Flutter

I am working on an app which has functionality similar to Instagram Reels. I want to know how to merge a song (audio) with the video while recording, and then store both together as one video file.
You can do this using the flutter_ffmpeg package.
Create a command like this:
command = "-y -i $videoPath -i $audioPath -map 0:v -map 1:a -c:v copy "
    "-shortest $savedFileLocation";
Then execute the command using:
final FlutterFFmpeg _flutterFFmpeg = FlutterFFmpeg();
_flutterFFmpeg.execute(command).then((rc) {
  statusCode = rc;
  print("FFmpeg process exited with rc $rc");
  return statusCode;
});
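The same mux can be expressed as a plain argument list, which makes the role of each flag easier to see; the file paths below are placeholders:

```python
# -map 0:v takes the video track from the first input, -map 1:a the audio
# track from the second; -c:v copy avoids re-encoding the video; -shortest
# stops the output at the shorter of the two inputs.
video_path, audio_path, saved = "in.mp4", "song.mp3", "out.mp4"

cmd = [
    "ffmpeg", "-y",
    "-i", video_path,
    "-i", audio_path,
    "-map", "0:v",
    "-map", "1:a",
    "-c:v", "copy",
    "-shortest",
    saved,
]
print(" ".join(cmd))
```

Because the video track is copied rather than re-encoded, the merge is fast and lossless; only the container and audio track change.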

Download file using Wget yammer export data

I created a console application that exports data from Yammer to local storage using wget (a third-party tool).
This is the reference:
https://developer.yammer.com/docs/data-export-api
The function that executes the script:
internal static bool ExecuteScript()
{
    try
    {
        ProcessStartInfo startInfo = new ProcessStartInfo("cmd.exe");
        startInfo.RedirectStandardInput = true;
        startInfo.UseShellExecute = false;
        startInfo.RedirectStandardOutput = true;
        startInfo.RedirectStandardError = true;
        Process p = Process.Start(startInfo);
        p.StandardInput.WriteLine("wget -O export.zip -t 1 --header \"Authorization: Bearer %Token%\" -ca-certificate cacert.pem https://www.yammer.com/api/v1/export?since=2016-02-09T00:00:00z");
        p.StandardInput.WriteLine("exit");
        string output = p.StandardOutput.ReadToEnd();
        string error = p.StandardError.ReadToEnd();
        p.WaitForExit();
        p.Close();
        Console.WriteLine("Error:" + error);
        return true;
    }
    catch (Exception ex)
    {
        throw;
    }
}
I replaced %Token% with my token, but when I run the code it cuts off the download and creates a 0 KB export.zip file; it does not download the complete file.
It shows this message in the console:
Console application output
However, if I put the same script in a batch file and run it from cmd in the same path, it downloads the complete file.
Notes:
1. I added the wget path to the Path environment variable.
2. I'm using Windows 10.
3. I'm using VS 2013.
I discovered the issue: the certificate option needs the --ca-certificate= form:
p.StandardInput.WriteLine("wget -O export.zip -t 1 --header \"Authorization: Bearer <Access Token>\" --ca-certificate=cacert.pem https://www.yammer.com/api/v1/export?since=2016-02-09T00:00:00z");
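For what it's worth, the quoting pitfalls of piping commands through cmd.exe's standard input disappear entirely when the process is launched with an explicit argument list. A sketch in Python (the token and URL are placeholders; the same idea applies in C# by setting wget as the ProcessStartInfo FileName and the flags as Arguments):

```python
# Build the wget invocation as an argument list: no shell, no escaping of
# the quoted Authorization header to get wrong.
token = "TOKEN"  # placeholder for the real bearer token
cmd = [
    "wget", "-O", "export.zip", "-t", "1",
    "--header", f"Authorization: Bearer {token}",
    "--ca-certificate=cacert.pem",  # note the '=' form that fixed the issue
    "https://www.yammer.com/api/v1/export?since=2016-02-09T00:00:00z",
]

# subprocess.run(cmd, check=True)  # uncomment to actually run the download
print(cmd)
```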

sphinx CJK support

We are using Sphinx with MySQL. Our MySQL is UTF-8 and contains Chinese characters, and we need Sphinx to support CJK. Here's what we have in sphinx.conf:
charset_type = utf-8
charset_table = 0..9, U+27, U+41..U+5a->U+61..U+7a, U+61..U+7a, \
U+aa, U+b5, U+ba, \
U+c0..U+d6->U+e0..U+f6, U+d8..U+de->U+f8..U+fe, U+df..U+f6, \
U+f8..U+ff, U+100..U+12f/2, U+130->U+69, \
U+131, U+132..U+137/2, U+138, \
...
...
...
ngram_chars = U+3400..U+4DB5, U+4E00..U+9FA5, U+20000..U+2A6D6,U+4E00..U+9FBB, U+3400..U+4DB5, U+20000..U+2A6D6, U+FA0E, U+FA0F, U+FA11, U+FA13, U+FA14, U+FA1F, U+FA21, U+FA23, U+FA24, U+FA27, U+FA28, U+FA29, U+3105..U+312C, U+31A0..U+31B7, U+3041, \
U+3043, U+3045, U+3047, U+3049, U+304B, U+304D, U+304F, U+3051, U+3053, U+3055, U+3057, U+3059, U+305B, U+305D, U+305F, U+3061, U+3063, U+3066, U+3068, U+306A..U+306F, U+3072, U+3075, U+3078, U+307B, U+307E..U+3083, U+3085, U+3087, U+3089..U+308E, U+3090..U+3093, \
U+30A1, U+30A3, U+30A5, U+30A7, U+30A9, U+30AD, U+30AF, U+30B3, U+30B5, U+30BB, U+30BD, U+30BF, U+30C1, U+30C3, U+30C4, U+30C6, U+30CA, U+30CB, U+30CD, U+30CE, U+30DE, U+30DF, U+30E1, U+30E2, U+30E3, U+30E5, U+30E7, U+30EE, U+30F0..U+30F3, U+30F5, U+30F6, U+31F0, \
U+31F1, U+31F2, U+31F3, U+31F4, U+31F5, U+31F6, U+31F7, U+31F8, U+31F9, U+31FA, U+31FB, U+31FC, U+31FD, U+31FE, U+31FF, U+AC00..U+D7A3, U+1100..U+1159, U+1161..U+11A2, U+11A8..U+11F9, U+A000..U+A48C, U+A492..U+A4C6
ngram_len = 1
And the MySQL conf:
character_set_client: utf8
character_set_connection: utf8
character_set_database: utf8
character_set_results: utf8
character_set_server: utf8
character_set_system: utf8
collation_connection: utf8_general_ci
collation_database: utf8_general_ci
collation_server: utf8_general_ci
init_connect: SET NAMES utf8
It manages to index weird characters such as these as Chinese: 今宵离别åŽä½•æ—¥å›å†æ¥
And real Chinese like this shows up as ??? in Sphinx: 后来
My belief is that there's some encoding problem, but I don't know where.
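The "weird characters" sample is a strong hint about where the problem is: it is exactly what UTF-8 bytes look like when decoded as latin1/cp1252, which suggests some hop between MySQL and Sphinx (for example, a client connection that never ran SET NAMES utf8) is not talking UTF-8. A sketch reproducing the pattern with the first character of the real sample 后来 (errors="ignore" stands in for bytes that cp1252 cannot display):

```python
# Take a real Chinese character, encode it as UTF-8, then mis-decode the
# raw bytes as cp1252 -- the Windows superset of latin1 that MySQL's
# "latin1" usually maps to.
text = "后"  # first character of the "real Chinese" sample in the question

mojibake = text.encode("utf-8").decode("cp1252", errors="ignore")
print(mojibake)  # åŽ
```

The output matches the "åŽ" fragment in the indexed garbage above, so the data is most likely double-encoded before Sphinx ever sees it; checking the actual charset of the indexer's MySQL connection would be the place to start.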