A, B and C are three Boolean values. Write an expression in terms of only ! (not) and && (and) which always gives the same value as A || B || C

I chanced upon this beautiful problem. Since I am new to Boolean expressions, it looks quite difficult.
I guess parentheses can be used.
If one of A, B, C is true, A||B||C must be true. It can be done using AND and NOT, but how do we know which variable has which value?
I tried using truth tables, but three variables were too much.
Any ideas on how to solve this, or at least how to approach it faster?

Learn De Morgan's laws. They're a basic piece of knowledge for a programmer.
They state, among others, that not(X or Y) = (not X) and (not Y).
If you negate both sides and then apply the formula twice, first to ((A or B) or C) treating the (A or B) subexpression as X, and then to (A or B) itself, you'll get the desired result:
A || B || C =
(A || B) || C =
!(!(A || B) && !C) =
!((!A && !B) && !C) =
!(!A && !B && !C)
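If you want to double-check the final form, here is a quick brute-force comparison over all eight assignments (a small Haskell sketch of my own, not part of the original answer; the name equivalent is arbitrary):

-- Compare A||B||C with !(!A && !B && !C) for every assignment.
equivalent :: Bool
equivalent = and [ (a || b || c) == not (not a && not b && not c)
                 | a <- [False, True], b <- [False, True], c <- [False, True] ]
-- equivalent evaluates to True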

De Morgan's Law (one of them, anyway), normally applied to two variables, states that:
A or B == not (not A and not B)
But this works equally well for three (or more) variables:
A or B or C == not (not A and not B and not C)
This becomes obvious when you realise that A or B or C is true if any of them is true; the only way to get false is if all of them are false.
And only if they're all false will not A and not B and not C give true (hence not(that) will give false). For confirmation, here's the table, where you'll see that the A or B or C and not(notA and notB and notC) columns give the same values:
A B C   A or B or C   notA notB notC   not(notA and notB and notC)
-----   -----------   --------------   ---------------------------
f f f        f          t    t    t                 f
f f t        t          t    t    f                 t
f t f        t          t    f    t                 t
f t t        t          t    f    f                 t
t f f        t          f    t    t                 t
t f t        t          f    t    f                 t
t t f        t          f    f    t                 t
t t t        t          f    f    f                 t
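The table above can also be generated mechanically; here is a small Haskell sketch (my addition, not part of the answer) that prints the A or B or C column next to the not(notA and notB and notC) column for every assignment:

-- Print both columns for all eight (A, B, C) assignments.
main :: IO ()
main = mapM_ putStrLn
  [ show (a, b, c) ++ "  " ++ show (a || b || c)
                   ++ "  " ++ show (not (not a && not b && not c))
  | a <- [False, True], b <- [False, True], c <- [False, True] ]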

Explanation of class instance declarations

I am following a tutorial and found this code:
data A = B | C deriving (Eq)

class K a where
  f :: a -> Bool

instance K A where
  f x = x == C
  f _ = False

call = f B
Why do I need f _ = False? I get the same result without it.
The answer is simply: you don't need f _ = False here. In fact, if you compile with -Wall then the compiler will warn you that this clause is redundant, because the f x = ... clause already catches everything.
If the tutorial told you to have that extra clause, well, it's wrong.
As pointed out, it's not necessary.
You might need (or want) that line, though, if you had a slightly different definition, one that does not require an Eq instance:
data A = B | C

class K a where
  f :: a -> Bool

instance K A where
  f C = True
  f _ = False
Instead of comparing x to C, you can match the argument directly against C and then define f to return False for all other values. This makes even more sense if there are more constructors that could produce False.
data A' = B | C | D   -- note: in one module these constructor names would clash with A's B and C

instance K A' where
  f C = True
  f _ = False   -- in place of f B = False and f D = False
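As a quick sanity check (my own snippet, assuming the definitions above for A with constructors B and C are in scope), evaluating f on each constructor gives the expected results for either version of the instance:

-- f C matches the first clause; f B falls through to the catch-all.
main :: IO ()
main = do
  print (f C)  -- True
  print (f B)  -- False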

Rearrange Boolean expression by replacing brackets

I was trying to check online whether there is any expression equivalent to 1. (A AND B) OR C, obtained by removing the brackets and rearranging the operands and Boolean operators in the above expression.
Like 2. (A OR B) AND C = A AND C OR B AND C.
If I try to transform the first expression with the same logic as above, the result doesn't seem logically equivalent, i.e. A OR C AND B OR C. I want to remove the brackets from the expression; that's my main aim.
What do you mean by "remove the brackets"?
When you write
(A OR B) AND C = A AND C OR B AND C
what does "A AND C OR B AND C" mean?
Does it mean
A AND (C OR B) AND C
or
(A AND C) OR (B AND C)
Probably the latter, but it is only because you see implicit brackets around the AND expressions.
One generally considers that "AND" has a higher priority (precedence, in programming languages) than "OR", just as "×" has a higher priority than "+",
and
a×b+c
is generally not ambiguous in terms of interpretation.
It is true that "×" can be distributed over a sum, while the opposite is not true.
But there is no such asymmetry in Boolean algebra: "AND" and "OR" have similar properties,
and each one distributes over the other.
So if your question is
"can we distribute an "OR" over an "AND" expression?",
the answer is yes.
(A AND B) OR C = (A OR C) AND (B OR C)
(just consider what happens when C=0 and when C=1 to verify that it is true)
If your question is
"does such an expression can be expressed without parenthesis with the rule that AND expressions are implicitely surrounded by parenthesis?",
the answer is also yes. Just write
A AND B OR C
with the precedence rule, it is interpreted as
(A AND B) OR C
You can also write
A.B+C
to make clearer that "AND" is generally considered as equivalent to "×" and "OR" equivalent to "+" in terms of precedence.
And in programming languages like C, when one writes
A & B | C
it is clearly interpreted as
(A & B) | C
and the same is true for
A && B || C
So with the "implicit parentheses around AND expressions" rule, you can write
A AND B OR C = (A OR C) AND (B OR C)
(A OR B) AND C = A AND C OR B AND C
Note that in either case, you need parentheses on one side of the equality.
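Both points (the OR-over-AND distribution and the precedence of AND over OR) can be checked with a short Haskell sketch of my own; in Haskell, as in C, && binds tighter than ||. The names are arbitrary:

-- (A AND B) OR C == (A OR C) AND (B OR C), checked for all assignments.
distributes :: Bool
distributes = and [ ((a && b) || c) == ((a || c) && (b || c))
                  | a <- [False, True], b <- [False, True], c <- [False, True] ]

-- Because && has higher precedence than ||, this parses as (a && b) || c.
noBrackets :: Bool -> Bool -> Bool -> Bool
noBrackets a b c = a && b || c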

Coq: How to prove if statements involving strings?

I have a string a and, on comparison with string b, if they are equal I have a string c, otherwise a string x. I know from a hypothesis that fun x <= fun c. How do I prove the statement below? fun is some function which takes a string and returns a nat.
fun (if a == b then c else x) <= S (fun c)
The logic seems obvious, but I am unable to split the if statement in Coq. Any help would be appreciated.
Thanks!
Let me complement Yves' answer by pointing out a general "view" pattern that works well in many situations where case analysis is needed. I will use the built-in support in math-comp, but the technique is not specific to it.
Let's assume your initial goal:
From mathcomp Require Import all_ssreflect.
Variables (T : eqType) (a b : T).
Lemma u : (if a == b then 0 else 1) = 2.
Proof.
Now, you could use case_eq + simpl to arrive at the next step; however, you can also match using more specialized "view" lemmas. For example, you could use ifP:
ifP : forall (A : Type) (b : bool) (vT vF : A),
        if_spec b vT vF (b = false) b (if b then vT else vF)
where if_spec is:
Inductive if_spec (A : Type) (b : bool) (vT vF : A) (not_b : Prop) : bool -> A -> Set :=
    IfSpecTrue  : b     -> if_spec b vT vF not_b true  vT
  | IfSpecFalse : not_b -> if_spec b vT vF not_b false vF
That looks a bit confusing; the important bit is the parameters of the type family bool -> A -> Set. The first exercise is "prove the ifP lemma!".
Indeed, if we use ifP in our proof, we get:
case: ifP.
Goal 1: (a == b) = true -> 0 = 2
Goal 2: (a == b) = false -> 1 = 2
Note that we didn't have to specify anything! Indeed, lemmas of the form { A } + { B } are just special cases of this view pattern. This trick works in many other situations; for example, you can also use eqP, which has a spec relating the boolean equality with the propositional one. If you do:
case: eqP.
you'll get:
Goal 1: a = b -> 0 = 2
Goal 2: a <> b -> 1 = 2
which is very convenient. In fact, eqP is basically a generic version of the type_dec principle.
If you can write an if-then-else statement, it means that the test expression a == b is in a type with two constructors, like bool or sumbool. I will first assume the type is bool. In that case, the best approach during a proof is to enter the following command:
case_eq (a == b); intros hyp_ab.
This will generate two goals. In the first one, you will have an hypothesis
hyp_ab : a == b = true
that asserts that the test succeeds and the goal conclusion has the following shape (the if-then-else is replaced by the then branch):
fun c <= S (fun c)
In the second goal, you will have an hypothesis
hyp_ab : a == b = false
and the goal conclusion has the following shape (the if-then-else is replaced by the else branch).
fun x <= S (fun c)
You should be able to carry on from there.
On the other hand, the String library from Coq has a function string_dec with return type {a = b}+{a <> b}. If your notation a == b is a pretty notation for string_dec a b, it is better to use the following tactic:
destruct (a == b) as [hyp_ab | hyp_ab].
The behavior will be quite close to what I described above, only easier to use.
Intuitively, when you reason about an if-then-else statement, you use a command like case_eq, destruct, or case that leads you to studying the two execution paths separately, remembering in a hypothesis why you took each of these paths.

Propositional formula for A, B, AC and BC

I am struggling to get a propositional formula for the following truth table: A, B, AC, BC.
For A and B it's easy: A xor B. However, when you insert a new literal C...
I tried using Wolfram by inputting the truth table (A & ~B & ~C) || (~A & B & ~C) || (A & ~B & C) || (~A & B & C). However, the suggested minimal forms are wrong since they do not consider C.
Can someone help expressing this in propositional logic using logical connectives like (A xor B) => C? Thanks!
You can perform minimization using Karnaugh maps (amongst other methods; this one is the simplest, though you'll have to introduce a dummy variable D and just ignore it in the results).
The suggested solutions are right not to consider C, though: it doesn't matter what C evaluates to, as long as A xor B evaluates to true. I just checked that, to remind myself how Karnaugh maps are constructed. Try drawing a full truth table yourself to see it.
Take a look at the expression:
(A & ~B & ~C) || (~A & B & ~C) ||
(A & ~B & C) || (~A & B & C)
Both lines are identical except for the negation of C; this means that C is irrelevant: the value of C doesn't change the output of the function.
This is also the conclusion one draws from a truth table:
|A|B|C||F|
+-+-+-++-+
|F|F|F||F|
|F|F|T||F|
|F|T|F||T|
|F|T|T||T|
|T|F|F||T|
|T|F|T||T|
|T|T|F||F|
|T|T|T||F|
Here F being the outcome of the expression. If you for instance take the first line: |F|F|F||F| the result is false, which is the same for |F|F|T||F| (with C flipped). By doing this for every (A,B) configuration one sees that the value of C doesn't matter.
Therefore you can simply exclude C from the formula resulting in:
(A & ~B) || (~A & B)
Which means A xor B.
Wolfram Alpha comes to the same conclusion (see ANF expression).
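To confirm the reduction mechanically, here is a small Haskell sketch of my own (the name sameAsXor is arbitrary) that compares the four-term expression from the question with A xor B (written /= on Bool) for every assignment:

-- The four-term sum of products versus A xor B; C turns out to be irrelevant.
sameAsXor :: Bool
sameAsXor = and
  [ ((a && not b && not c) || (not a && b && not c) ||
     (a && not b && c)     || (not a && b && c))     == (a /= b)
  | a <- [False, True], b <- [False, True], c <- [False, True] ]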
I have the answer:
(A xor B) and (C => (A or B))
Since A xor B already implies A or B, the implication C => (A or B) holds whenever the first conjunct does, so this is equivalent to plain A xor B.

How are these boolean expressions (truth tables) equivalent?

I am trying to better understand boolean equivalence but this example has me a little stuck.
I am referring to this website: http://chortle.ccsu.edu/java5/Notes/chap40B/ch40B_9.html
It makes sense, but at the same time it doesn't... the page says they are equivalent, but the true/false values don't seem to line up in a way that makes them equivalent, even though the table shows they are. Could someone explain this to me?
!(A && B) <-- first expression
(C || D) <-- second expression
The last column refers to the equivalence of the two expressions, and yes, they are equivalent according to the table. However, I just don't get how the two expressions are equivalent. If A = F, B = F --> T, wouldn't C = F, D = F --> T as well?
A B C D   C || D
----------------
F F T T     T
F T T F     T
T F F T     T
T T F F     F
You are confusing yourself when trying to reduce the actual expressions to single-letter variables. Referring to the actual link, it would appear that the variables you use map to the original expressions as follows:
A = speed > 2000
B = memory > 512
C = speed <= 2000
D = memory <= 512
If you look at it, C equals !A and D equals !B. So the expression (C || D) is effectively ((!A) || (!B)). By De Morgan's Law, that is the same as !(A && B).
This table is explaining that !(A && B) is equivalent to !A || !B. The columns C and D appear to be defined as C = !A and D = !B. The last column is C || D.
So A = F, B = F certainly implies !(A && B). In this case C = D = T, and so also C || D = T.
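For completeness, here is a short Haskell sketch (my addition, with an arbitrary name) that checks the De Morgan identity behind the table, with C and D written out as not A and not B:

-- !(A && B) versus (C || D) where C = not A and D = not B.
matchesTable :: Bool
matchesTable = and [ not (a && b) == (not a || not b)
                   | a <- [False, True], b <- [False, True] ]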