iOS Charts - Values on X-Axis not formatted correctly in portrait mode - Swift

I'm in the process of integrating "iOS Charts" into my latest app. Everything works, but the formatting of the x-axis isn't done properly in portrait mode on iOS (see screenshots); landscape mode is fine. The chart data contains 290 entries, each with a date and a numeric value. Some formatting is obviously applied, but not enough. (see screenshots)
I am working with Xcode 8 / Swift 3 / the latest build from CocoaPods (version 3.0.0 of Charts) / iOS 10.
Here is my only code relating to the x-axis, where "Charty" is my LineChartView:
Charty.xAxis.granularityEnabled = true
Charty.xAxis.granularity = 10
Charty.xAxis.labelPosition = .bottom
Charty.autoScaleMinMaxEnabled = true

Try the code below while the chart is in portrait mode.
lineChartView.xAxis.spaceBetweenLabels = 4
lineChartView.xAxis.wordWrapEnabled = false
spaceBetweenLabels will help you align the labels on the x-axis.
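If spaceBetweenLabels is not available in your Charts version, a possible alternative is to cap the label count and rotate the labels. This is only a sketch: `Charty` is the LineChartView from the question, and the property names assume the Charts 3.0 Swift API, so verify them against your installed version.

```swift
import Charts

// Sketch: make long date labels fit in portrait width.
func configureXAxis(_ chart: LineChartView) {
    let xAxis = chart.xAxis
    xAxis.labelPosition = .bottom
    xAxis.granularityEnabled = true
    xAxis.granularity = 10
    // Force at most 6 labels so they do not overlap in portrait.
    xAxis.setLabelCount(6, force: true)
    // Rotating labels buys extra horizontal space for long date strings.
    xAxis.labelRotationAngle = -45
    // Keep the first/last label from being clipped at the chart edges.
    xAxis.avoidFirstLastClippingEnabled = true
}
```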

Related

iOS 16 Poor SpeechSynthesisUtterance voice quality

Since the iOS 16 update, my vocabulary app (a PWA) has problems speaking the provided text via the SpeechSynthesisUtterance object. It doesn't affect all languages; e.g. Russian sounds the same as before the update to iOS 16. With German or English, the quality is very low: muffled, and the voice sounds nasal... On macOS Safari everything works as expected, but not on iOS 16.
const fullPhrase = toFullPhrase(props.phrase);
const utterance = new SpeechSynthesisUtterance();

onMounted(() => { // Vue lifecycle hook
  utterance.text = fullPhrase;
  utterance.lang = voice.value.lang;
  utterance.voice = voice.value;
  utterance.addEventListener(ON_SPEAK_END, toggleSpeakStatus);
});
I tried to modify the pitch and rate properties, but without success... Did they change the API for SpeechSynthesis / SpeechSynthesisUtterance in Safari on iOS 16, maybe?
It looks like iOS 16 introduced a lot of new (sometimes very weird) voices for en-GB and en-US. In my case I was looking up a voice by language only and taking the first match; as a result I was getting one of the strange new voices.
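A more robust lookup can filter by language and prefer voices by name, falling back to the first match. This is only a sketch: the helper name `pickVoice` and the preferred-name list are made up for illustration; in the browser the voices array would come from `speechSynthesis.getVoices()`.

```javascript
// Pick a speech-synthesis voice by language, preferring known-good names.
// `voices` is an array of SpeechSynthesisVoice-like objects ({ name, lang }).
function pickVoice(voices, lang, preferredNames = []) {
  const matching = voices.filter(v => v.lang === lang);
  for (const name of preferredNames) {
    const found = matching.find(v => v.name.startsWith(name));
    if (found) return found;
  }
  return matching[0] || null; // fall back to any voice for that language
}

// In the browser: pickVoice(speechSynthesis.getVoices(), 'en-GB', ['Daniel'])
```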

Swift SF Symbols - only available in iOS 13.0 or newer error

Been playing around with the SF Symbols and very impressive they are too.
However when I try and use them in code from a number of SF examples I come up with:
let image = UIImage(systemName: "hands.clap") // error: 'init(systemName:)' is only available in iOS 13.0 or newer
Do I need to copy the image into the Asset Catalogue as before? What I want to avoid is caveating every instance of their usage, e.g.
if #available(iOS 13.0, *) {
    heartImage = UIImage(systemName: "heart.fill")!
} else {
    heartImage = UIImage(named: "heart.jpeg")
}
The symbol name was copied and pasted as per most of the instructions.
What am I missing?
SF Symbols are only available from iOS 13, and some specific symbols require an even higher deployment version (14 or 15). Check the deployment target in your app target's "General" tab and the minimum deployment version of your desired symbol. If you want to use symbols while supporting lower iOS versions, you have to wrap the image creation in if #available(iOS 13.0, *), as you mentioned. If you want a fallback image for lower iOS, your only option is to create a separate asset and load it with UIImage(named: "heart.jpeg"). You can also export your symbols manually and add them to the asset catalogue, but keep in mind that you need to convert them from SVG to some other format, because iOS versions below 13 also do not support SVG with preserved vector data.
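To avoid caveating every usage site, the availability check can be wrapped in a small helper. This is only a sketch: the function name `symbolOrAsset` is made up for illustration, and it assumes the fallback image exists in the asset catalogue under the given name.

```swift
import UIKit

// Hypothetical helper: try an SF Symbol on iOS 13+, otherwise fall back
// to a bundled asset with the given name.
func symbolOrAsset(symbol: String, fallback: String) -> UIImage? {
    if #available(iOS 13.0, *) {
        return UIImage(systemName: symbol)
    }
    return UIImage(named: fallback)
}

// Usage: heartImage = symbolOrAsset(symbol: "heart.fill", fallback: "heart")
```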

How to change the compact size class programatically? [duplicate]

This question already has answers here:
How to change font size for iPhone 4 and iPhone 5 in auto-layout?
(1 answer)
Size class to identify iPhone 6 and iPhone 6 plus portrait
(10 answers)
Closed 4 years ago.
I have added size classes on a Label for both compact and regular types.
The compact class, however, affects all iPhone devices, but not all iPhone screen sizes are equal, so I would like a bit more granular control and to reduce the size for iPhone 5/SE even further.
I haven't been able to find an official way to do this, so I thought I'd do it directly in code.
There is the DeviceKit pod that helps to find the iPhone type:
import DeviceKit

override func viewDidLoad() {
    super.viewDidLoad()
    let groupOfAllowedDevices: [Device] = [.iPhoneSE, .iPhone5s, .iPhone5, .iPhone5c,
                                           .simulator(.iPhoneSE), .simulator(.iPhone5s),
                                           .simulator(.iPhone5), .simulator(.iPhone5c)]
    let device = Device()
    if device.isOneOf(groupOfAllowedDevices) {
        myLabel.font = UIFont.boldSystemFont(ofSize: 10.0)
    }
}
However, this doesn't work; it would only work if I removed both size classes in IB.
Is there a way to manipulate the Compact size class of a specific label?
That way I could leave the default compact size for all devices other than iPhone SE/iPhone 5, and only decrease it further for the given two devices.
If there is a better way, I'm happy to know about it. Thanks
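One possible workaround, not from the question itself but a sketch under assumptions: drop the size-class font override in IB for this label and branch on the screen's point height in code, so IB's trait-based override no longer fights the programmatic font. 568 points is the iPhone 5/5s/5c/SE (1st gen) portrait height; the 14-point default below is an assumed value.

```swift
import UIKit

// Sketch: choose the label font based on screen size instead of
// size classes, assuming no size-class font override remains in IB.
override func viewDidLoad() {
    super.viewDidLoad()
    let longestSide = max(UIScreen.main.bounds.width, UIScreen.main.bounds.height)
    if longestSide <= 568 { // iPhone 5/5s/5c/SE (1st gen)
        myLabel.font = UIFont.boldSystemFont(ofSize: 10.0)
    } else {
        myLabel.font = UIFont.boldSystemFont(ofSize: 14.0) // assumed default
    }
}
```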

Images not appearing on buttons when using old GTK 2 versions

I'm currently testing a GTK 2 version of a program on systems using GTK 2.6, namely Ubuntu 5.04 and Fedora Core 4. I have the issue there that I am unable to create image-only buttons without a label. On later GTK versions (tested with Ubuntu 6.06, Fedora 8 and 10) this works. It looks like this:
Ubuntu 5.04 and Fedora Core 4:
Ubuntu 6.06 (and similar in Fedora 8 and 10):
I've downloaded GTK 2.6 to find a clue in its documentation, but so far I have not found out why this is happening.
The code I use for image-only buttons is this:
GtkWidget *button = gtk_button_new ();
gtk_button_set_image (GTK_BUTTON (button), gtk_image_new_from_stock (GTK_STOCK_REMOVE, GTK_ICON_SIZE_BUTTON)); // Example
What am I missing here?
(The program is supposed to also run on old and low-end systems, that's why I am bothering with this.)
EDIT
It seems that the behaviour which I had expected was introduced with version 2.8.14, that's why it worked on Ubuntu 6.06 which uses GTK 2.10. This was not obvious to me from reading the documentation.
Packing a stock image into the button with gtk_container_add() is a way to create label-less, image-only buttons on those earlier versions.
Here's the code of gtk_button_set_image as of GTK+ 2.6.9:
/**
 * gtk_button_set_image:
 * @button: a #GtkButton
 * @image: a widget to set as the image for the button
 *
 * Set the image of @button to the given widget. Note that
 * it depends on the gtk-button-images setting whether the
 * image will be displayed or not.
 *
 * Since: 2.6
 */
void
gtk_button_set_image (GtkButton *button,
                      GtkWidget *image)
{
  GtkButtonPrivate *priv = GTK_BUTTON_GET_PRIVATE (button);

  priv->image = image;
  priv->image_is_stock = (image == NULL);

  gtk_button_construct_child (button);
  g_object_notify (G_OBJECT (button), "image");
}
So first we can see that showing images in buttons is controlled by the gtk-button-images setting from your gtkrc, so you may need to force this.
We also see that setting a stock icon can be done differently: set the name of the stock icon as the button label and set the button's use-stock property to TRUE. So maybe you could try that too.
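The gtk_container_add() fallback mentioned in the edit above can be sketched as follows. Untested against GTK 2.6; the helper name image_only_button is made up, but the calls are the standard GTK 2 API.

```c
#include <gtk/gtk.h>

/* Sketch: image-only button that also works before GTK 2.8.14 by
 * packing the image into the button directly instead of relying on
 * gtk_button_set_image(). */
static GtkWidget *
image_only_button (const gchar *stock_id)
{
  GtkWidget *button = gtk_button_new ();
  GtkWidget *image  = gtk_image_new_from_stock (stock_id, GTK_ICON_SIZE_BUTTON);

  gtk_container_add (GTK_CONTAINER (button), image);
  gtk_widget_show (image);

  return button;
}

/* Usage: GtkWidget *remove_button = image_only_button (GTK_STOCK_REMOVE); */
```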

Algorithmia RecognizeCharacters version 0.3.0 service response time differs on iOS 10 and iOS 11 device

We are using the Algorithmia .../ocr/RecognizeCharacters/0.3.0 service to post PNG image data and get the recognized characters in our Swift 4 app.
We had tested it earlier on an iOS 10 device (iPhone 5s) and the time was around 1 minute for a single scan.
Now we have two iPhone 5s devices. One has the latest iOS 11.2.1; the other has the old iOS 10.3.3.
We found the following difference in the time taken for the same service on these devices.
The same photo was captured on both devices and scanned.
The response time on the two iPhone 5s devices in our app was:
9 minutes on the model with iOS 11.2.1 (latest OS)
1 minute on the model with iOS 10.3.3
We then exchanged the photos (copied the scanned photo from one device to the other and called the service again on both devices):
10 minutes on iOS 11.2.1
2 minutes on iOS 10.3.3
The following is the code used for the service in our app:
// Measure the elapsed time
let startDate = Date()
print("Start Date/time \(startDate)")

// Algorithmia starts
let client = Algorithmia.client(simpleKey: "*****")
let algo = client.algo(algoUri: "ocr/RecognizeCharacters/0.3.0").pipe(data: image.png) { resp, error in
    if error == nil {
        // .... code to handle response ....
    } else {
        // .... code to handle error ....
    }
    // Algorithmia ends
    let endDate = Date()
    // Include .second so sub-minute differences are visible too
    let components = Calendar.current.dateComponents([.hour, .minute, .second], from: startDate, to: endDate)
    print("End Date/time \(endDate) diff \(components.hour ?? 0):\(components.minute ?? 0):\(components.second ?? 0)")
}
Why is there this difference between the two OS versions?
What needs to be changed for the code to run properly on iOS 11 and above?
Are there any changes required to service calls on the new iOS?
Any clues, links, or help will be appreciated.
This was strange.
I just now received the iOS update to version 11.2.2.
As per the hint from @Upholder Of Truth, I updated the same iPhone 5s device.
I started the app and ran the same functionality.
The scanning time was back to the normal 1 minute.
Relieved...