I wanted to use a macro to check whether a function returns a particular generic type, say Array, so it should accept Array<Dynamic>, Array<String>, or even a generic Array<T>.
So I tried to Context.unify it with Array<Dynamic>. This works for Array<String> or Array<Dynamic>, but it fails when the type parameter is generic, because the ComplexType Array<T> won't convert to a Type and raises "Type not found: T" (see the code below). Is there any way to achieve what I am attempting to do?
package;
#if macro
import haxe.macro.Context;
using haxe.macro.ComplexTypeTools;
#end
#if !macro @:build(Macros.build()) #end
class Main
{
    public function test<T>():Array<T>
    {
        return [];
    }
}
class Macros
{
    public static function build()
    {
        #if macro
        var fields = Context.getBuildFields();
        for (field in fields)
        {
            switch (field.kind)
            {
                case FFun(f):
                    // try to unify Array<String> with Array<Dynamic>
                    trace(Context.unify((macro:Array<String>).toType(), (macro:Array<Dynamic>).toType()));
                    // true
                    // try to unify Array<T> with Array<Dynamic>
                    trace(Context.unify(f.ret.toType(), (macro:Array<Dynamic>).toType()));
                    // Type not found: T
                default:
            }
        }
        return null;
        #end
    }
}
UPDATE
So, checking TPath was not the best idea.
Based on the previous assumption that any type is assignable to Dynamic, we can replace the unconvertible type parameter with Dynamic (e.g. Array<T> becomes Array<Dynamic>) and then try to unify it.
// Note: assumes `import haxe.macro.Expr;` is in scope (for ComplexType, TypePath and Error).
static function toTypeExt(c:ComplexType):Null<haxe.macro.Type>
{
    try {
        return c.toType();
    } catch (error:Error)
    {
        switch (c)
        {
            case TPath(p):
                // deep copy of the TPath with all type parameters as Dynamic
                var paramsCopy = [for (i in 0...p.params.length) TPType(macro:Dynamic)];
                var pCopy:TypePath = {
                    name: p.name,
                    pack: p.pack,
                    sub: p.sub,
                    params: paramsCopy
                }
                var cCopy = TPath(pCopy);
                // convert after
                return cCopy.toType();
            default:
        }
    }
    return null;
}
Use toTypeExt() in your build macro instead of toType():
trace(Context.unify(toTypeExt(f.ret), (macro:Array<Dynamic>).toType()));
This looks more like a workaround to me, but there is a strange thing about ComplexTypeTools.toType: it succeeds with a class type parameter while failing with a method type parameter.
OLD ANSWER
Unification won't work, since there is no way of converting a ComplexType with a type parameter to a Type (in that context). But since you are unifying with Array<Dynamic>, it is safe to assume that any Array will unify with it (any type is assignable to Dynamic, see http://haxe.org/manual/types-dynamic.html).
It may not be the prettiest solution, but a simple TPath check is the way to go here:
case FFun(f):
    switch (f.ret) {
        case TPath({ name: "Array", pack: [], params: _ }):
            trace("Yay!");
        default:
    }
Related
I'm using Swift 5 and trying to compile the following code:
protocol BasicProtocol {
    associatedtype T
    var str: T { get set }
}

struct AItem<U>: BasicProtocol {
    typealias T = U
    var str: T

    init<G: StringProtocol>(str: G) where G == T {
        self.str = str
    }
}
I got a compilation error:
error: Test.playground:10:45: error: same-type requirement makes generic parameters 'G' and 'U' equivalent
init<G: StringProtocol>(str: G) where G == T {
^
How can I make them equivalent? Or is that not possible?
Thanks.
Update 1:
This is the problem I encountered: I want to declare a struct AItem with a generic type T, and this generic type will have some restrictions, such as T: StringProtocol. Then, for some reason, I need to use an array to hold these structs, while still being able to set the generic parameter of each struct freely.
I learned that type erasure might solve this, so I tried that approach, but it did not seem to work, and the error mentioned above occurred.
Update 2:
struct AItem<T: StringProtocol> {
    var aStr: T
}
var array: [AItem<Any>] = [AItem(aStr: "asdfasdf")]
If you compile this code, you will get a compilation error:
error: Test.playground:5:13: error: type 'Any' does not conform to protocol 'StringProtocol'
var array: [AItem<Any>] = [AItem(aStr: "asdfasdf")]
^
If I use var array: [AItem<String>], I will not be able to put any non-String instance that conforms to StringProtocol into the array.
This is why I said I want "ensure that the generics of each structure can be set at will".
Update 3:
Many thanks to @jweightman; I have now updated my question again.
protocol ConstraintProtocol {}
extension String: ConstraintProtocol{}
extension Data: ConstraintProtocol{}
extension Int: ConstraintProtocol{}
.......
struct AItem<T = which class has implemented "ConstraintProtocol"> {
    var aPara: T
    init(aPara: T) {
        self.aPara = aPara
    }
}

// make an array to contain them
var anArray: [AItem<any class which has implemented "ConstraintProtocol">] = [AItem(aPara: "String"), AItem(aPara: 1234), AItem(aPara: Data("a path")), …]

// then I can use any item in anArray. Maybe I will implement a method to
// inspect these generics and perform the appropriate action.
for curItem in anArray {
    var result = handleItem(curItem)
    do something...
}

func handleItem<T: ConstraintProtocol>(item: AItem<T>) -> Any? {
    if (item.T is ...) {
        do someThing
        return ......
    } else if (item.T is ...) {
        do someThing
        return ...
    }
    return nil
}
This is my whole idea, but all of it is pseudo-code.
It seems like type erasure is the answer to your problem. The key idea of the type erasure pattern is to put your strongly typed but incompatible data (like an AItem<String> and an AItem<Data>) inside another data structure that stores them with "less precise" types (usually Any).
A major drawback of type erasure is that you're discarding type information—if you need to recover it later on to figure out what you need to do with each element in your array, you'll need to try to cast your data to each possible type, which can be messy and brittle. For this reason, I've generally tried to avoid it where possible.
Anyways, here's an example of type erasure based on your pseudo code:
import Foundation // for Data

protocol ConstraintProtocol {}
extension String: ConstraintProtocol {}
extension Data: ConstraintProtocol {}
extension Int: ConstraintProtocol {}

struct AItem<T: ConstraintProtocol> {
    var aPara: T
    init(aPara: T) {
        self.aPara = aPara
    }
}

struct AnyAItem {
    // By construction, this is always some kind of AItem. The loss of type
    // safety here is one of the costs of the type erasure pattern.
    let wrapped: Any

    // Note: all the constructors always initialize `wrapped` to an `AItem`.
    // Since the member variable is constant, our program is "type correct"
    // even though type erasure isn't "type safe."
    init<T: ConstraintProtocol>(_ wrapped: AItem<T>) {
        self.wrapped = wrapped
    }

    init<T: ConstraintProtocol>(aPara: T) {
        self.wrapped = AItem(aPara: aPara)
    }

    // Think about why AnyAItem cannot expose any properties of `wrapped`...
}

var anArray: [AnyAItem] = [
    AnyAItem(aPara: "String"),
    AnyAItem(aPara: 1234),
    AnyAItem(aPara: "a path".data(using: .utf8)!)
]

for curItem in anArray {
    let result = handleItem(item: curItem)
    print("result = \(result)")
}

// Note that this function is no longer generic. If you want to try to "recover"
// the type information you erased, you will have to do that somewhere. It's up
// to you where you want to do this.
func handleItem(item: AnyAItem) -> String {
    if (item.wrapped is AItem<String>) {
        return "String"
    } else if (item.wrapped is AItem<Data>) {
        return "Data"
    } else if (item.wrapped is AItem<Int>) {
        return "Int"
    }
    return "unknown"
}
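As a small illustration of the "recovering" step mentioned in the comment above, a conditional cast gives you back the strongly typed item. This is only a sketch building on the AnyAItem type from the example; the describe function is a hypothetical name, not part of the original code:
func describe(item: AnyAItem) -> String {
    // Try to recover the erased type with conditional casts.
    if let stringItem = item.wrapped as? AItem<String> {
        return "String item: \(stringItem.aPara)"
    } else if let intItem = item.wrapped as? AItem<Int> {
        return "Int item: \(intItem.aPara)"
    }
    return "unknown"
}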
An alternative to type erasure you could consider, which works well if there's a small, finite set of concrete types your generic could take on, would be to use an enum with associated values to define a "sum type". This might not be a good choice if the protocol you're interested in is from a library that you can't change. In practice, the sum type might look like this:
enum AItem {
    case string(String)
    case data(Data)
    case int(Int)
}

var anArray: [AItem] = [
    .string("String"),
    .int(1234),
    .data("a path".data(using: .utf8)!)
]

for curItem in anArray {
    let result = handleItem(item: curItem)
    print("result = \(result)")
}

func handleItem(item: AItem) -> String {
    // Note that no casting is required, and we don't need an unknown case
    // because we know all types that might occur at compile time!
    switch item {
    case .string: return "String"
    case .data: return "Data"
    case .int: return "Int"
    }
}
I have a type called Setting that takes a generic type parameter as such:
Setting<T>
Every setting contains a value that can be an Int32, String, Bool, or a custom object type, etc. Here is some of the full implementation of Setting:
class Setting<T> {
    var key: String?
    var defaultValue: T?
    //...
}
This all works with various type params as expected; however, now there is a requirement for a collection that contains multiple Setting objects that could have various type parameters. When I declare an array variable of type [Setting], the compiler obviously expects a type parameter, which is not known until runtime.
I've tried using a protocol and an extension on the types that could be used for the generic type parameter such as this:
protocol SettingProtocol {
    func getType() -> Self.Type
}

extension Int32: SettingProtocol {
    func getType() -> Int32.Type {
        return Int32.self
    }
}

extension String: SettingProtocol {
    func getType() -> String.Type {
        return String.self
    }
}
//...
and declaring my array as
var settings = [Setting<SettingProtocol>]()
but this does not work when I try to append a Setting instance to the array as follows:
var newSetting = Setting<String>()
newSetting.setDefaultValue(value: "SomeString")
settings?.append(newSetting) // compile error here
and results in the following compiler error:
Cannot convert value of type 'Setting<String>' to expected argument type 'Setting<SettingProtocol>'
Also, using the protocol/extension route might require an extension on every type that might be encountered when building these objects, which seems really clunky.
I feel like there should be a way to accomplish this. Also hoping that when I pull these items out of the array that I can avoid a lot of type checking.
Can anyone offer any advice?
Change
class Setting<T>
to
class Setting<T:SettingProtocol>
and try compiling.
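For reference, here is a minimal sketch of that change, reusing the fields from the question (setDefaultValue is assumed from the usage shown there):
protocol SettingProtocol {}
extension Int32: SettingProtocol {}
extension String: SettingProtocol {}

// The suggested change: constrain the generic parameter to SettingProtocol.
class Setting<T: SettingProtocol> {
    var key: String?
    var defaultValue: T?

    // Assumed from the question's usage; the full implementation is elided there.
    func setDefaultValue(value: T) {
        self.defaultValue = value
    }
}

let newSetting = Setting<String>()
newSetting.setDefaultValue(value: "SomeString") // compiles: String conforms to SettingProtocol
Note that the settings array still needs a concrete element type such as [Setting<String>]; the answer below discusses the heterogeneous case.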
Actually, you can't define:
var settings = [Setting<SettingProtocol>]()
because the generic type of Setting must be one of the concrete types but not the protocol itself. For example, you could declare it as:
var settings = [Setting<String>]() // since you already implemented extension String:SettingProtocol { ...
Therefore you could append objects of type Setting<String>; however, that's not what you are looking for, since you need settings to be a heterogeneous container.
So what you could do is:
class Setting {
    var key: String?
    var defaultValue: SettingProtocol?
}

protocol SettingProtocol {}
extension Int32: SettingProtocol {}
extension String: SettingProtocol {}
At this point, you declared defaultValue to be of type SettingProtocol, without the need of dealing with a generic.
Therefore:
var settings = [Setting]()

var newStringSetting = Setting()
newStringSetting.defaultValue = "My String"
settings.append(newStringSetting)

var newInt32Setting = Setting()
newInt32Setting.defaultValue = Int32(100)
settings.append(newInt32Setting)

for setting in settings {
    print(setting.defaultValue)
    // Optional("My String")
    // Optional(100)
}
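If you later need the concrete type back when reading from the array (the question mentions wanting to avoid a lot of type checking), one option is a conditional cast on defaultValue. This is just a sketch; the two cases are examples, and any type you add via an extension would need its own branch:
for setting in settings {
    if let stringValue = setting.defaultValue as? String {
        print("String setting: \(stringValue)")
    } else if let intValue = setting.defaultValue as? Int32 {
        print("Int32 setting: \(intValue)")
    } else {
        print("unknown or empty setting")
    }
}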
I am a Swift developer and new to Dart. I am trying to write a generic method.
I would like to know whether I can achieve something similar to this in Dart.
//Swift version
public func modelFrom<T: Mappable>(response: Encodable?, model: T.Type) -> T? {
    if let response = response, let string = response.jsonString {
        return T(JSONString: string)
    }
    return nil
}
// The method can be called like this
let responseModel = modelFrom(response: response, model: FoodLogModel.self)
In Dart, can I constrain the type of the generic class, like <T: Mappable> in Swift?
Is runtimeType in Dart equivalent to <ClassName>.self in Swift?
Many thanks
Here is what I have tried,
import 'package:dartson/dartson.dart';
import 'package:mobile_corelib/base/model.dart';
T requestFrom<T>(BaseModel model, Type T) {
  try {
    var dson = new Dartson.JSON();
    var object = dson.map(dson.encode(model), T.runtimeType);
    return object;
  } catch (error) {
    return null;
  }
}
var dick = requestFrom(model, AccessTokenRequest().runtimeType)
But I don't know how to pass in the class type. Should I use dynamic, or Type?
runtimeType in Dart is a value of type Type, which is not the same thing as what happens when you write a type in a generic type parameter.
As an example, new List<String>() is valid, but new List<''.runtimeType>() is a syntax error.
Generally, Type is quite limited. You can't use it to create new instances, unless you're in the VM, where you can use mirrors.
I'm familiar with generic classes in TypeScript, where a class can be defined with an associated type variable, and then instances with a specific type can manipulate and return values of that type.
Problem: I want a generic class that creates instances of the type variable. For instance:
class Thing {
  thingProp: string;
}

class ThingOne extends Thing {
  thingOneProp: string;
}

class ThingTwo extends Thing {
  thingTwoProp: string;
}

class Maker<T extends Thing> {
  make(): T {
    return new T();
    // ^--- error TS2304: Cannot find name 'T'
  }
}

let thingOneMaker = new Maker<ThingOne>();
let thingOne: ThingOne = thingOneMaker.make();

let thingTwoMaker = new Maker<ThingTwo>();
let thingTwo: ThingTwo = thingTwoMaker.make();

let thingError: ThingOne = thingTwoMaker.make();
// ^--- "error TS2322: Type 'ThingTwo' is not assignable to type 'ThingOne'"
This almost seems to work. The compiler generates code, and the error on the last line shows that TypeScript understands what type thingTwoMaker.make() should return.
However, the error on return new T(); shows that TypeScript doesn't understand that I'm trying to make instances of the type variable's class, and the generated JavaScript confirms it:
var Maker = (function () {
    function Maker() {
    }
    Maker.prototype.make = function () {
        return new T(); // <- "error TS2304: Cannot find name 'T'"
    };
    return Maker;
}());
And, not surprisingly, running the generated JavaScript with Node.js produces a ReferenceError: T is not defined error.
How can I make a generic class whose instances can create instances of the type variable class? (Tested using TypeScript 2.0.10.)
Somehow, you need to give the Maker class the constructor of the type you'd like it to make, so that it has a value on which to call new.
I'd say a good option would be to pass the class constructor as an argument to Maker's constructor. That will allow it to construct instances of that class, and it will automatically infer the type that it's building so you don't have to manually annotate the generic type anymore.
So maybe something like this:
class Maker<T extends Thing> {
  private ctor: {new(): T};

  constructor(ctor: {new(): T}) {
    this.ctor = ctor;
  }

  make(): T {
    return new this.ctor();
  }
}
Then, you can pass the right class constructor to each kind of Maker, and the type will be automatically inferred:
let thingOneMaker = new Maker(ThingOne);
let thingOne: ThingOne = thingOneMaker.make();
let thingTwoMaker = new Maker(ThingTwo);
let thingTwo: ThingTwo = thingTwoMaker.make();
// Still an error.
let thingError: ThingOne = thingTwoMaker.make();
Playground link.
Long time listener, first time caller.
I'm getting the following error:
Cannot convert value of type MyClass<Model<A>, OtherClass> to expected argument type MyClass<Protocol, OtherClass>
Despite the fact that Model<A> conforms to Protocol.
I've attached a snippet that can be run in Playgrounds that resembles what I am actually trying to achieve.
protocol DisplayProtocol {
    var value: String { get }
}

class DataBundle<T: CustomStringConvertible>: DisplayProtocol {
    var data: T
    var value: String {
        return data.description
    }

    init(data: T) {
        self.data = data
    }
}

class Mapper<DisplayProtocol, Data> {
    // ...
}

class MapperViewModel<Data> {
    let mapper: Mapper<DisplayProtocol, Data>

    init(mapper: Mapper<DisplayProtocol, Data>) {
        self.mapper = mapper
    }
}
let dataBundle = DataBundle<Int>(data: 100)
let mapper = Mapper<DataBundle<Int>, Bool>()
let viewModel = MapperViewModel<Bool>(mapper: mapper) // <- This fails w/error
Is this the expected behavior? If it is, it feels like it's breaking the contract of allowing me to have DisplayProtocol as a type in Mapper.
This is caused by the fact that Swift generics are invariant with respect to their arguments. Thus MyClass<B> is not compatible with MyClass<A> even if B is compatible with A (subclass, protocol conformance, etc.). So yes, unfortunately the behaviour is the expected one.
In your particular case, if you want to keep the current architecture, you might need to use protocols with associated types and type erasers.
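For illustration, here is a rough sketch of the type-eraser route, reusing DisplayProtocol and DataBundle from the question. The AnyDisplay wrapper and the constrained Mapper/MapperViewModel signatures are illustrative replacements for the question's declarations, not part of the original code:
// Hypothetical type eraser for DisplayProtocol.
struct AnyDisplay: DisplayProtocol {
    private let _value: () -> String
    var value: String { return _value() }
    init<D: DisplayProtocol>(_ base: D) {
        self._value = { base.value }
    }
}

// Unlike the question's Mapper<DisplayProtocol, Data>, where "DisplayProtocol"
// is just a parameter name shadowing the protocol, the first parameter here is
// renamed and constrained.
class Mapper<Display: DisplayProtocol, Data> {
    // ...
}

class MapperViewModel<Data> {
    let mapper: Mapper<AnyDisplay, Data>
    init(mapper: Mapper<AnyDisplay, Data>) {
        self.mapper = mapper
    }
}

let dataBundle = DataBundle<Int>(data: 100)
let erased = AnyDisplay(dataBundle) // wrap concrete bundles before handing them to the mapper
let mapper = Mapper<AnyDisplay, Bool>()
let viewModel = MapperViewModel<Bool>(mapper: mapper) // compiles: both sides name the same concrete type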