proj-oot-ootLogicNotes1

I guess if we are doing 'RDFt' we should generalize this to add other modifiers (qualifiers and quantifiers, but other stuff too, i guess) to RDF, e.g. certainty level, provenance/source of evidence, mode, scope (time and place at which the assertion is made; these are like prepositional phrases in English i guess), forall/exists, exactlyone, etc. Consider adding the other Kantian categories too. Hmm, that generalization is pretty interesting...

mb call this idea 'RDF+'
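a quick Python sketch of what such an annotated 'RDF+' assertion might look like (all field names here are made up for illustration, not any existing RDF library's API):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical 'RDF+' assertion: an RDF-style triple plus open-ended
# modifiers (certainty, provenance, time/place scope, quantifier).
@dataclass
class AssertionPlus:
    subject: str
    predicate: str
    obj: str
    certainty: float = 1.0        # e.g. confidence in [0.0, 1.0]
    source: Optional[str] = None  # provenance/source of evidence
    time: Optional[str] = None    # scope: when the assertion holds
    place: Optional[str] = None   # scope: where it holds
    quantifier: str = "assert"    # "assert", "forall", "exists", "exactlyone", ...

a = AssertionPlus("bob", "worksAt", "acme",
                  certainty=0.8, source="hr-db", time="2014")
```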

the tinkerpop guys are already onto 'RDFt': http://nosql.mypopescu.com/post/21377687732/distributed-temporal-graph-database-using-datomic

--

and of course we need the ontological basics too.. can start with looking at semweb stuff like OWL (and RDF and N3) and topic maps

--

need a syntax to encompass most logics e.g. most deductive inference systems, as well as the basics of proof theory, model theory

e.g. a general framework for defining logic programming languages like Prolog or constraint satisfaction languages

--

is the benefit of RDFt over traditional data structures obtainable by adding roles to data structures and autoconverting, e.g. homomorphisms?

--

http://en.wikipedia.org/wiki/Refal ( http://en.wikipedia.org/wiki/Markov_algorithm )

--

to look into:

--

www.ksl.stanford.edu/people/stolle/Papers/meta96.ps

adds control clauses to prolog to allow programmer to control order of goal selection during theorem proving

--

" Metaprolog..the extensions are primarily aimed at allowing programs to manipulate multiple theories, allowing distinct theories to have multiple 'viewpoints' of a relation, and to make proofs available as terms in the language' "

"Reflective Prolog...allows the programmer to extend the object-level provability relation by clauses to be used at the metalevel. However, a Reflective Prolog program is an amalgamated theory (bayle note: i think they mean that the object level assertions and the metalevel assertions live in the same space), containing object level axioms as well as metalevel axioms. For example, the clauses

solve(#P($X, $Y)) :- reflexive(#P), solve(#P($Y, $X)).
reflexive(<spouse>).

(where #P is a metalevel variable ranging over names of predicate symbols while $X and $Y are metalevel variables ranging over names of terms) say that a binary atom, whose predicate symbol denotes a reflexive relation, can be proved by instead proving an atom whose arguments have been exchanged with respect to the original atom; the predicate 'spouse' denotes such a relation "
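a toy Python rendering of that Reflective Prolog rule (hypothetical names; note the paper's term is 'reflexive' even though the rule exchanges arguments, i.e. behaves like symmetry):

```python
# If a predicate is declared 'reflexive' (the quoted paper's term; the
# rule actually exchanges the two arguments), a goal P(x, y) may be
# proved by proving P(y, x) instead.
facts = {("spouse", "mary", "john")}
reflexive = {"spouse"}        # metalevel knowledge about predicate names

def solve(pred, x, y):
    if (pred, x, y) in facts:
        return True
    # meta-to-object reflection: try the swapped-argument goal
    if pred in reflexive and (pred, y, x) in facts:
        return True
    return False

print(solve("spouse", "john", "mary"))   # True, via the swapped goal
```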

the Godel language has metaprogramming modules for datatypes representing Godel programs and theories, and also predicates that attempt to prove a formula

MOL (i didnt understand this part)

Meta by Kowalski... name each theory by opaque terms that do not themselves indicate the axioms for that theory.. but you can give those separately.

Combinatory Logic Programming... makes for more efficient implementations

Truth predicates: Sato, "Metaprogramming through a truth predicate", has investigated the use of a self-referential truth predicate, given semantics in three-valued logic, for metaprogramming. Jiang, "On the semantics of real metalogic programming", advocates a language with a nonground representation that does not distinguish between terms and sentences.

Alloy "combines ideas from Reflective Prolog (automatic meta-to-object reflection as an inference rule) and Kowalski's Meta (the representation of theories and the application of meta-to-object reflection also to program clauses)."

so, 3 uses of metaprogramming in logic (Barklund counts 4 but i identify 2 of his): writing/extending compilers etc; controlling procedural behavior; representing/reasoning about knowledge

Barklund speaks a lot about how it is problematic to use 'improper nonground representations', in metalogic, that is, representations in the metalanguage of expressions in the base language which contain metavariables, because there is "a high risk that the programmer introduces subtle errors by unintentional instantiation of metavariables that actually represent object variables". i don't understand this, too bad Barklund didn't give examples.

-- Metaprogramming in Logic by J Barklund, 1994, CiteSeerX: citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.52.2769

---

https://en.wikipedia.org/wiki/Answer_set_programming

--

mafribe 13 hours ago

Neither Coq nor Isabelle are based on Martin-Löf Type Theory (in a narrow sense). Coq is based on a variant of the calculus of construction (with (co)inductive definitions) and Isabelle is an LCF-style prover framework that you can instantiate with various different logics. The most widely used variant is Isabelle/HOL which implements a variant of Church's theory of types. To implement other logics, Isabelle uses intuitionistic second order logic based on lambda calculus, without numbers or recursive data types. Idris has Type:Type, so is unsound as a logic. I'm not sure exactly what Epigram uses, but I assume it's somehow based on Luo's ECC. Only Agda is based on Martin-Löf Type Theory.

tel 12 hours ago

Yeah... This is correct as far as I know though a bit out of my complete technical grasp. To my understanding MLTT was a fairly central topic and it helps to distinguish MLTT from just type theory since plain-old-TT might be harder to distinguish as something interesting above `(int *)`.

But, to solidify, mafribe is much more correct than my comment and there's, as far as I'm aware, a much larger menagerie of other type theories. The recent Homotopy Type Theory publication—which was on HN as notable due to its publication process—was yet another step in this direction.


--

hmm.. even if you use relevance logic instead of the material conditional, you give up truth-functionality but you still have some other stuff:

---

probabilistic programming:

http://en.wikipedia.org/wiki/Probabilistic_programming_language

http://probabilistic-programming.org/wiki/Home

http://radar.oreilly.com/2013/04/probabilistic-programming.html

https://projects.csail.mit.edu/church/wiki/Church

http://probcomp.csail.mit.edu/venture/

http://probabilistic-programming.org/wiki/NIPS*2012_Workshop

http://probcomp.csail.mit.edu/bayesdb/

---

trellis and prolog both could use an assertion/truth maintenance syntax

---


https://www.fpcomplete.com/school/to-infinity-and-beyond/pick-of-the-week/a-little-lens-starter-tutorial#-sidenote--what-is-that-forall-thing-- 's forall:

" >>> view _1 ("goal", "chaff") "goal"

>>> forall $ \tuple -> view _1 tuple == fst tuple True "

forall is being used here as a form of an assertion; it's an assertion with a variable in it
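a rough Python analogue of that 'forall': an assertion with a free variable, checked on many random instances (this is testing, not proof; the helper names are made up):

```python
import random

def view_1(tup):
    # analogue of `view _1`: look at the first component
    return tup[0]

def forall(prop, trials=100):
    # check the property on a batch of random tuples
    return all(prop((random.randint(0, 9), random.randint(0, 9)))
               for _ in range(trials))

print(forall(lambda t: view_1(t) == t[0]))   # True
```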

---

Ceptre: A Language for Modeling Generative Interactive Systems.

Ceptre: A Language for Modeling Generative Interactive Systems. Chris Martens 2015

    We present a rule specification language called Ceptre, intended to enable rapid prototyping for experimental game mechanics, especially in domains that depend on procedural generation and multi-agent simulation.
    Ceptre can be viewed as an explication of a new methodology for understanding games based on linear logic, a formal logic concerned with resource usage. We present a correspondence between gameplay and proof search in linear logic, building on prior work on generating narratives. In Ceptre, we introduce the ability to add interactivity selectively into a generative model, enabling inspection of intermediate states for debugging and exploration as well as a means of play.
    We claim that this methodology can support game designers and researchers in designing, analyzing, and debugging the core systems of their work in generative, multi-agent gameplay. To support this claim, we provide two case studies implemented in Ceptre, one from interactive narrative and one from a strategy-like domain.

Some choice quotes from the article follow.

Simple examples of the rule language:

    The meaning of A -o B, to a first approximation, is that whenever the predicates in A are present, they may be replaced with B. One example of a rule is:
    do/compliment:
         at C L * at C’ L * likes C C’
      -o at C L * at C’ L * likes C C’ * likes C’ C.
    [...]
    Note that because of the replacement semantics of the rule, we need to reiterate everything on the right-hand side of the -o that we don’t want to disappear, such as the character locations and original likes fact. We use the syntactic sugar of prepending $ to anything intended not to be removed in order to reduce this redundancy:
    do/compliment: $at C L * $at C’ L * $likes C C’ -o likes C’ C.
    A more complex rule describes a murder action, using the ! operator to indicate a permanent state:
    do/murder:
        anger C C’ * anger C C’ * anger C C’ * anger C C’
        * $at C L * at C’ L * $has C weapon
      -o !dead C’.
    (This rule consumes C’s location, maintaining a global invariant that each character is mutually exclusively at a location or !dead.) Here we see a departure from planning formalisms: four instances of anger C C’ mean something different from one. Here we are using an emotion not just as a precondition but as a resource, where if we have enough of it, we can exchange it for a drastic consequence. Whether or not we diffuse the anger, or choose to keep it by prepending $ to the predicates, is an authorial choice.
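The replacement semantics above can be sketched as multiset rewriting in a few lines of Python (a toy, not Ceptre's actual implementation; predicates are flattened into plain tokens, so the names here are made up):

```python
from collections import Counter

# Ceptre-style rule A -o B: fires when the multiset A is present in the
# state, consuming A and producing B. Four copies of a resource really
# do mean four copies.
def fire(state, lhs, rhs):
    need = Counter(lhs)
    if all(state[r] >= n for r, n in need.items()):
        state.subtract(need)
        state.update(Counter(rhs))
        return True
    return False

state = Counter({"anger": 4, "at_C": 1, "at_Cp": 1, "has_weapon": 1})
# do/murder: consume four angers and C''s location, produce !dead;
# $-prefixed facts (C's location, the weapon) are re-emitted on the right.
ok = fire(state,
          ["anger"] * 4 + ["at_C", "at_Cp", "has_weapon"],
          ["at_C", "has_weapon", "dead"])
print(ok, state["anger"], state["dead"])   # True 0 1
```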

Concurrency in narration:

    Two rule applications that consume disjoint sets of resources from the same state can be said to happen concurrently, or independently. On the other hand, a rule that produces resources and another that consumes a subset of them can be said to be in a causal, or dependent, relationship. Less abstractly, if resources represent facts associated with particular game entities or characters, then independent rule applications represent potentially concurrent action by multiple such entities, and causally related rule applications represent either sequential action by a single actor, or synchronized interaction between two entities.

Stages, and a larger example:

    We would like some of these rules to run automatically without player intervention. In our next iteration of the program, we will make use of a Ceptre feature called stages. Stages are a way of structuring a program in terms of independent components. Syntactically, a stage is a curly-brace-delimited set of rules with an associated name. Semantically, a stage is a unit of computation that runs to quiescence, i.e. no more rules are able to fire, at which point control may be transferred to another stage.
    [...]
    Additionally, we can test the design by “scripting” certain player strategies. For instance, we could augment the two rules in the fight stage to be deterministic, fighting when the monster can’t kill us in one turn and fleeing otherwise:
    stage fight = {
      do_fight:
          choice * $fight_in_progress * $monster Size * $health HP * Size < HP
        -o try_fight.
      do_flee :
          choice * fight_in_progress * $monster Size * $health HP * Size >= HP
        -o flee_screen.
    }
    If we remove interactivity from this stage, then we get automated combat sequences that should never result in the player’s death.

By gasche at 2015-08-04 14:03


---

we need some language to express guarantees: eventual, C,A,P, atleastonce, atmostonce, A,C,I,D, B,A,S,E, ordered (and all these with respect to certain operations or classes of ops, eg availability wrt read ops but not write ops; i guess capabilities can use the same ops description language), guaranteed, best-effort, permitted/forbidden, required/optional, deterministic/stochastic, before/after/during, known/unknown. are these alphanumeric identifiers prefixed with a sigil that means "some sort of modal logical thing"? mb "?" is the sigil?

---

note: sigi uses 'connection words' "and, does, has, is" in http://communitywiki.org/en/nLSD. That's a nice set of beginning 'predicate' like things.

note: unlike has, is, does, sigi's connection word 'and' is being used not so much as an ordinary relation (which is like a clause, i guess; "Bob is blue"), but rather to create a noun phrase, which is like a struct

---

todo: check out https://en.wikipedia.org/wiki/Common_Logic

http://www.jfsowa.com/talks/clintro.pdf also describes the IKL (Interoperable Knowledge Language) extension to Common Logic, which Sowa seems to like (more on IKL: http://www.ihmc.us/users/phayes/IKL/GUIDE/GUIDE.html http://philebus.tamu.edu/cl/docs/cl/SemTech2008/Menzel-SemTech2008.pdf http://philebus.tamu.edu/cl/docs/cl/SemTech2008/cl_sowa.pdf )

http://www.iso-commonlogic.org/

also i've already looked into it a bit, but "topic maps" should be noted, for later readers

note: Common Logic is a later version/offshoot of KIF https://en.wikipedia.org/wiki/Knowledge_Interchange_Format

---

need to distinguish 'is' into:

and we also need modals for:

eg :

---

sometime look into "conceptual graphs" too

maybe also look into Murray Altheim (from CW)'s project, Ceryle (which seems to have been deleted, but here's some of the docs): http://web.archive.org/web/20081029023001/http://altheim.4java.ca/ceryle/wiki/Wiki.jsp?page=UserDocumentation

unfortunately i cant seem to find copies of his ontologies: http://web.archive.org/web/20070321051852/http://altheim.4java.ca/ceryle/wiki/Wiki.jsp?page=CeryleOntologies . The docs site for Ceryle is down and the SourceForge project doesn't have any publicly available file downloads or sourcecode

maybe email Murray and ask him to dig them up someday, if i actually have time to read them.

(why was Ceryle deleted? no archived trace of any 'goodbye' blog post that i can see: http://web.archive.org/web/20110809051814/http://www.altheim.com/ceryle/blog/index.html )

---

i think we need indexicals: here, now, i, you, we, they

second-person demonstratives (like 'this' (near demonstrative) and 'that' (far demonstrative), but for a thing that is 'near you' instead of 'near me'; apparently Japanese has these, acc. to Wikipedia's 'demonstrative' article)

---

should have language to represent:

interpretation of formula G under framework F in context C is equivalent to framework F2 formula H (in context D); and instead of IS EQUIVALENT, also IS APPROXIMATELY, MAY BE / I THINK, and also variants of these that imply a set of possibly-not-equivalent-to-each-other candidate translations (i guess these are std modals applied to it, though?)

---

proposed method additions to HTTP:

http://www.iana.org/assignments/http-methods/http-methods.xhtml

PATCH CREATE

REST extension/superset but with more/additional verbs, subtypes of POST (idempotent non-PUT? (side effects? or should that be included in PUT), impure GET (again, should that even be different?)). TRADE, buy, sell. exercise option (USE?) allocate, free? load, store, mov? if?

rest+order flow processing for POSTs

---

rest extended to multiple args, eg indirect obj (alice gives apple to bob), tuple (alice hascolor blue), trinary operators (compare a to b using lessthan/equal, returns -1/0/1/err)

---

https://en.wikipedia.org/wiki/JSON#Object_references

http://json-ld.org/

---

an example of the sort of application that integrated logic programming might be good for:

parliamentary procedure software where the parliamentary procedure is encoded in logic (this allows you to add things like legal rights to the procedure, to demand logical consistency between ordinary bills and the organization's constitution, to prove whether various hypothetical rights are consistent with current laws/rules, to adjudicate some issues automatically, etc). But this logic needs to interface with stateful I/O for things such as recording votes and logging in members. And things such as vote-counting/voting decision algorithms are probably more easily expressed using functional or even imperative programming style rather than prolog-ish logic.

---

in logic, we might want operators that apply directly to predicates and transform them (including but not limited to: argument list rearrangements, and an 'identity' function on predicates that acts as a way to test whether two things behave identically as an argument to a predicate, first with x1 and then with x2 (that is, we want to be able to define I(predicate), where I(Pxyz) iff (Pxz iff Pyz); z can stand for any number of additional arguments))
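a possible Python sketch of such predicate-transforming operators, with predicates as boolean functions (the operator names are made up):

```python
def swap(p):
    """Argument rearrangement: swap(p)(x, y, *z) == p(y, x, *z)."""
    return lambda x, y, *z: p(y, x, *z)

def ident(p):
    """I(P): I(P)(x1, x2, *z) iff (P(x1, *z) iff P(x2, *z)) --
    'do x1 and x2 behave identically as P's first argument?'"""
    return lambda x1, x2, *z: p(x1, *z) == p(x2, *z)

less = lambda a, b: a < b
print(swap(less)(1, 2))        # False: 2 < 1
print(ident(less)(1, 3, 5))    # True: (1 < 5) iff (3 < 5)
print(ident(less)(1, 7, 5))    # False: (1 < 5) but not (7 < 5)
```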

---

interesting logical structure: "Bob has failed as a farmer". The attribute "failure" is specific to the role "farmer"; perhaps Bob succeeded in another role. "Role" is like an Oot "view" for the logical subject.

---

example proof annotations for code:

---

http://web.archive.org/web/20150619164601/http://www.emu.edu.tr/aelci/Courses/D-318/D-318-Files/plbook/semantic.htm has a lot of good examples of logical stuff that we should support:

---

let's look for alternate logical notations and syntaxes, so that we can design a syntax for Oot that accommodates most of them:

https://www.google.com/search?client=ubuntu&hs=FzG&channel=fs&q=%22logic+notations%22&oq=%22logic+notations%22&gs_l=serp.3...9642.13195.0.13320.21.10.0.0.0.0.0.0..0.0....0...1c.1.64.serp..21.0.0.C-jXT-YWtaE https://www.google.com/search?client=ubuntu&hs=reb&channel=fs&q=%22logical+notations%22&oq=%22logical+notations%22&gs_l=serp.3...9988.10121.0.10234.2.2.0.0.0.0.0.0..0.0....0...1c.1.64.serp..2.0.0.s6Wxg_QE75E https://www.google.com/search?client=ubuntu&hs=nzG&channel=fs&q=%22notations+for+logic%22&oq=%22notations+for+logic%22&gs_l=serp.3...5899.8229.0.8355.12.7.0.0.0.0.0.0..0.0....0...1c.1.64.serp..12.0.0.Rb6aMiPWQSI https://www.google.com/search?client=ubuntu&hs=0zG&channel=fs&q=%22notations+for+logics%22&oq=%22notations+for+logics%22&gs_l=serp.3...5014.5014.0.5136.1.1.0.0.0.0.0.0..0.0....0...1c.1.64.serp..1.0.0.6U8ywj1HQAw https://www.google.com/search?client=ubuntu&hs=oKw&channel=fs&q=alternative+notations+for+logics&oq=alternative+notations+for+logics&gs_l=serp.3...26153.31509.0.33373.14.9.0.0.0.0.0.0..0.0....0...1c.1.64.serp..14.0.0.zmRgBPQucMg https://www.google.com/search?client=ubuntu&hs=I2G&channel=fs&q=%22alternative+notation%22+logic&spell=1&sa=X&ved=0ahUKEwjXz6zRwevKAhUByGMKHYaTD_sQvwUIGigA&biw=1200&bih=645 https://www.google.com/search?client=ubuntu&channel=fs&q=logics+syntaxes&oq=logics+syntaxes&gs_l=serp.3..0i71l8.48047.48122.0.48262.2.2.0.0.0.0.0.0..0.0....0...1c.1.64.serp..2.0.0.jumUI15Gz7E https://www.google.com/search?client=ubuntu&channel=fs&q=logics+%22syntaxes%22&oq=logics+%22syntaxes%22&gs_l=serp.3...10398199.10399738.0.10399862.2.2.0.0.0.0.434.509.1j4-1.2.0....0...1c.1.64.serp..0.0.0.ZTQiiJAjMf8 https://www.google.com/search?client=ubuntu&channel=fs&q=%22logics%22+%22syntaxes%22&oq=%22logics%22+%22syntaxes%22&gs_l=serp.3..0i71l8.17247.17525.0.17625.2.2.0.0.0.0.0.0..0.0....0...1c..64.serp..2.0.0.sSBNAdP5_Jw 
https://www.google.com/search?client=ubuntu&channel=fs&q=%22logic%22+%22syntaxes%22&oq=%22logic%22+%22syntaxes%22&gs_l=serp.3...50039.50039.0.50456.1.1.0.0.0.0.414.414.4-1.1.0....0...1c.1.64.serp..0.0.0.tjuxAmY7FbU https://www.google.com/search?client=ubuntu&channel=fs&q=%22logic%22+alternative+%22syntaxes%22&oq=%22logic%22+alternative+%22syntaxes%22&gs_l=serp.12...26584.28757.0.38089.12.5.0.0.0.0.408.408.4-1.1.0....0...1c.1.64.serp..11.0.0.yKhJjLDlHVs https://www.google.com/search?client=ubuntu&channel=fs&q=%22logical%22++%22syntaxes%22&oq=%22logical%22++%22syntaxes%22&gs_l=serp.3...8743.8743.0.8862.1.1.0.0.0.0.92.92.1.1.0....0...1c.1.64.serp..0.0.0.ByLgOzaUrQo

---

in the notes below i've 'transcribed' many of the symbols into ASCII lookalikes or madeup latex symbols:

https://en.wikipedia.org/wiki/List_of_logic_symbols
N3
FOL
second order logic
OWL, OWL2
ACE (Attempto Controlled English) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.433.6462&rep=rep1&type=pdf
CL: dunno if these are any good, they just came up in the search. i think i took some notes on CL (Common Logic) elsewhere: http://www.obitko.com/tutorials/ontologies-semantic-web/common-logic.html http://www.jfsowa.com/talks/clprop.htm
triples
modal logic
https://en.wikipedia.org/wiki/Description_logic
https://en.wikipedia.org/wiki/Template:Non-classical_logic
https://en.wikipedia.org/wiki/Category:Knowledge_representation_languages
http://www-personal.umich.edu/~jlawler/IELL-LogicalNotation.pdf (skimmed it, good intro but nothing new in here for me)
set-builder notation
http://math.stackexchange.com/questions/926873/logical-notations-for-mathematical-statements (just an example problem, but happens to be a good example)
https://www.reddit.com/r/askphilosophy/comments/3gi4ou/why_do_we_have_so_many_different_logical/
http://www.ttu.ee/public/s/sotsiaalteaduskond/Instituudid/avaliku_halduse/Notation_-_Book_of_Abstracts.pdf
http://www.loria.fr/~roegel/cours/symboles-logiques-eng.pdf (skimmed it, only thing of interest for me here is his choice of concepts to study: not, or, and, if/then, forall, exists, iff)
Graphics and Languages For the Flexible Modular Framework, John F. Sowa
mb "the infinite variety of logics" by sowa, but it doesn't seem to be available on the web for free, so i'm inclined to skip it and read his later, available work only. later: i skimmed it, no need to read further. it looks great actually, but not for this purpose (notation), rather for my understanding of AI. Skimming it in Google Books seems like enough, though. I particularly like his description of Petri Nets, i feel like i finally understand what they're good for.
The intro also helped me remember my foundations-of-logic history, and the following quote from the intro provides pointers to some stuff that i still have to learn, although it sounds like they are less intimidating than i thought: "For the principia, Russell adapted Frege's rules to the algebraic notation to create the most inefficient proof procedure that has ever been widely taught (but never widely learned). Better proof procedures for the algebraic notation were only invented many years later: natural deduction by Gerhard Gentzen (1935), semantic tableaux by Evert Beth (1955) and resolution by Alan Robinson (1965)."

I didn't read the section on situation calculus but seeing the first paragraph of it helped me review and remember what i already learned about that, and the section on existential graphs helped me understand what they are (an alternate notation for FOL-type stuff; his 'beta' graphs are "isomorphic to first-order logic with identity, with all formulas closed;" [2])

mb The Age of Alternative Logics: Assessing Philosophy of Logic and Mathematics; and: search in that book for the word "notation"

eg A double-turnstile_G Phi[V, F_\Phi], where <A, Phi, V> is a game, F_\Phi is a strategy

eg "if transitivity were not holding in my logic, then \square a \to \square \square a would fail too (in my logic)"

eg "Suppose the standard logic is S4. We should use a notation to differentiate the modality which defines the standard logic...from the modalities which are used within the corresponding non-normal logic. Let us use \triangle (resp. \upsidedowntriangle) for necessity (resp. possibility)..."

eg "..the real unit interval I := [0,1] \subseteq R with its natural ordering, which will be referred to as the truth ordering, or \leq_T"

eg V/\forall x in "...game correlated to \forall x(\phi_1 (V/\forall x)\phi_2), the verifier will choose a disjunct \phi_i not knowing the value of x."

"The introduction of the slash-notation at the syntactic level leads to a new logic, IF-FOL, which enables us to tackle new patterns of mutual (in)dependence between quantifiers. A paradigmatic example is provided by the Henkin or branching quantifiers, such as:" -- Manuel Rebuschi

see also https://en.wikipedia.org/wiki/Independence-friendly_logic

eg "Similarly, let us compare 16.6 and 16.7: He didn't know anything had happened and neither did she (16.6) *He knew that nothing had happened and neither did she" (note: i didn't see a picture of the symbols so the transcription is missing those)

eg "Particle rules for non-normal dialogics: The players may choose not only contexts, they may also choose the propositional logic underlying the modalities in the chosen contexts: Or in the more formal notation of state of game (see appendix):" (note: i didn't see a picture of the symbols so the transcription is missing those)

eg "...Then the ignorance de re of the same winning strategy can be accounted for in IF-EAL with the slash notation, applied either to the disjunction: G', 2 \yen Kv((y := a) winv(V/Kv)(2/ := 6)winv) (17.14) or to the union symbol: G',2JP Kv((i/ := a(U/Kv)y ..."
(note: i didn't see a picture of the symbols so the transcription is probably misleading)

https://en.wikipedia.org/wiki/Branching_quantifier
https://en.wikipedia.org/wiki/Generalized_quantifier
https://en.wikipedia.org/wiki/Dependence_logic
substitution [v/x]
the first few pages of "a topological correctness criterion for multiplicative non-commutative logic"
https://en.wikipedia.org/wiki/Non-classical_logic
https://en.wikipedia.org/wiki/Logical_connective#History_of_notations
http://www.jfsowa.com/pubs/fflogic.htm
http://staff.um.edu.mt/cabe2/lectures/webscience/docs/sowa_93.pdf
http://cl.tamu.edu/docs/cl/SemTech2008/cl_sowa.pdf
"the classic calculus of relations, according to Tarski 1941, consists of the Boolean calculus of classes with the addition of the relative (or Peircean) constants: 1' (or I for the identity relation), 0' (or T for the diversity relation), ^\circle (for the converse), \tallcrosswithlefttail (or \tallcross for relative addition) and ; (or, simply, contiguity for relative multiplication)"
https://en.wikipedia.org/wiki/Bra%E2%80%93ket_notation mb write as <b for bra and k> for ket and <bk> when they are together (if together was just <bk> then parser might complain when it sees k> with no < preceding it)

matrices
single turnstile
double turnstile
set builder notation
subset, superset
http://www.wbricken.com/pdfs/01bm/03notate/visual/01syntactic-variety.pdf (skimmed it, probably not relevant, it's a logic paper about representations for the inference method in Spencer Brown's Laws Of Form)
proj-plbook-plChProofLangs eg coq, isabelle, twelf
https://www.google.com/search?client=ubuntu&channel=fs&q=%22higher-order+abstract+syntax+%28HOAS%22&ie=utf-8&oe=utf-8
WSML https://www.w3.org/2004/12/rules-ws/slides/josdebruijn.pdf
http://www.lix.polytechnique.fr/Labo/Dale.Miller/papers/lics97.pdf
rule markup languages and semantic web rule languages
i havent read these yet, dunno if they are on-topic, they just have an interesting title:
https://lists.w3.org/Archives/Public/www-rdf-logic/2001May/0191.html
https://lists.w3.org/Archives/Public/www-rdf-logic/2001May/0183.html
https://lists.w3.org/Archives/Public/www-rdf-logic/2001May/0188.html
"We define an extension of predicate logic, called Binding Logic, where variables can be bound in terms and in propositions."

---

from Introduction: Logic and Logic programming languages, michael J O'Donnell:

" There are two quaternary relational notations from which all the other notations may be derived. Let Q be a question, T a set of formulae, D a proof, and F a formula. The ((quaternary answer relation)) notation

Q -> T

D - F ((read Q answers T hypothesizes D proves F))

means that in response to the question $Q$, given the postulates\subind{postulate} in $\mathbold{T}$, we may discover the proof $D$ of the answer $F$. Similarly, let I be an input, P a program, C a computation, and O an output. The notation ((quaternary computation relation)) notation

I \rightfacingtriangle P \tallrectangle C \to O ((read I inputs P computes C outputs O; latex is I\inputs P\computes C\outputs O))

means that in response to the input $I$, the program $P$ may perform the computation $C$, yielding output $O$. ... There are two closely related trinary notations.

Q ?- T

- F ((read Q answers T hyproves F)) ((trinary answer relation))

means that there exists a proof $D$ such that Q -> T

D - F , and

I \rightfacingtriangle P \tallrectangle\to O ((computed-output relation))

means that there exists a computation $C$ such that

I\inputs P\computes C\outputs O (above i wrote I \rightfacingtriangle P \tallrectangle C \to O)

...

Q ?- F

means that $F$ is an answer (not necessarily a correct one) to the question $Q$, and

T

D - F

means that $D$ is a proof of $F$, using postulates in the set $\mathbold{T}$

T

D - F

is a conventional concept from mathematical logic (often written ... T \singleturnstile_D F). The question-answer relation ?- is not conventional. "

---

my thoughts on the above from Introduction: Logic and Logic programming languages, michael J O'Donnell:

D - F
y A> z" would be acceptable for quaternary
- seems good for single turnstile,for double turnstile
-_D F seems better than T D - F to me

---

and k> such that <bk> means something special. Apparently x> means that x is a column vector, and <x means the conjugate transpose of x> (that is, a row vector whose elements are the conjugates of the elements of x); but the next section says that in some contexts, bras are more generally linear functionals on kets. Not sure if we want this notation; it seems to me that k> isn't any different from 'k', except that it provides the reader more information about what type k is
c d ... %> z"; but is that really necessary? just use "<%(a,b,c,d,...z)" in that case

---

"...Peirce discovered elegant rules of inference for ((existential graphs)) that are a generalization and simplification of Gentzen's rules of natural deduction. Peirce himself further generalized the rules to graph versions of modal logic and higher-order logic. It turns out that Peirce's rules are so general that any proof by resolution corresponds to the reverse of a proof by Peirce's form of natural deduction. Unfortunately, Peirce's paper..1906...attracted little or no attention...((Peirce's patterns)) can be adapted to many other notations...each adaptation requires only two definitions: how each operator affects positive and negative contexts; and what it means for one formula to be a generalization or specialization of another. In any notation, a positive context is enclosed in an even number of negations, including a context with no negations at all; and a negative context is enclosed in an odd number of negations. The count of negations includes the explicit ((negation)) operator, but it also includes the implicit negations inside the definitions of the other logical operators. Each of them affects the negation depth of its operands..."


---

so i guess Datomic uses 'fluents', although it doesn't use that word?

" In artificial intelligence, a fluent is a condition that can change over time. In logical approaches to reasoning about actions, fluents can be represented in first-order logic by predicates having an argument that depends on time. For example, the condition “the box is on the table”, if it can change over time, cannot be represented by \mathrm{On}(\mathrm{box},\mathrm{table}); a third argument is necessary to the predicate \mathrm{On} to specify the time: \mathrm{On}(\mathrm{box},\mathrm{table},t) means that the box is on the table at time t. This representation of fluents is used in the situation calculus using the sequence of the past actions in place of the current time. "

seems like no one is talking about this: https://www.google.com/search?client=ubuntu&hs=jEe&channel=fs&q=%22datomic%22+fact+%22fluent%22+-%22fluent+api%22&oq=%22datomic%22+fact+%22fluent%22+-%22fluent+api%22&gs_l=serp.3...33178.36981.0.37091.14.9.0.0.0.0.0.0..0.0....0...1c.1.64.serp..14.0.0.VyvkU6PsrXQ
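the fluent idea in miniature, as a hypothetical Python sketch (a time-indexed predicate; all names made up):

```python
# A fluent is a condition that can change over time, so the predicate
# takes an extra time argument: On(box, table, t).
on = set()

def assert_on(x, y, t):
    on.add((x, y, t))

def holds_on(x, y, t):
    return (x, y, t) in on

assert_on("box", "table", 0)
print(holds_on("box", "table", 0))   # True
print(holds_on("box", "table", 1))   # False: the fluent may not hold later
```

in the situation calculus the last argument would be a sequence of past actions rather than a numeric time.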

---

http://www.jfsowa.com/talks/clintro.pdf

IKL is the extension; it adds 'that':

" In CLIF notation, the operator 'that' applied to a CL or IKL sentence denotes the proposition stated by the sentence: (exists ((x1 Believe)) (and (Person Tom) (Expr x1 Tom) (Thme x1 (that (exists ((x2 Want) (s Situation)) (and (Person Mary) (Expr x2 Mary) (Thme x2 s) (Dscr s (that (exists ((x3 Marry) (x4 Sailor)) (and (Agnt x3 Mary) (Thme x3 x4) )))))))))) "

http://www.ihmc.us/users/phayes/IKL/SPEC/SPEC.html (see table at top) http://www.ihmc.us/users/phayes/ikl/guide/guide.html

---

https://en.wikipedia.org/wiki/List_of_constructed_languages#Knowledge_representation

---

Formal verification for re-entrant Solidity contracts using why3

https://gist.github.com/chriseth/c4a53f201cd17fc3dd5f8ddea2aa3ff9

---

'require' and 'ensure' might be better names for my 'promise' and 'demand'?
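For comparison, a minimal runtime sketch of require/ensure as a decorator (hypothetical API, names invented here just to pin down the semantics I mean):

```python
import functools

# Hypothetical 'require'/'ensure' contract decorator; checks preconditions on
# the arguments and postconditions on the result, at runtime.
def contract(require=None, ensure=None):
    def deco(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            if require is not None:
                assert require(*args, **kwargs), "require failed"
            result = f(*args, **kwargs)
            if ensure is not None:
                assert ensure(result), "ensure failed"
            return result
        return wrapper
    return deco

@contract(require=lambda x: x >= 0, ensure=lambda r: r >= 0)
def isqrt_floor(x):
    # toy example function with a precondition and a postcondition
    return int(x ** 0.5)

print(isqrt_floor(9))  # 3
```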

---

to emphasize:

" Animats 128 days ago [-]

The system: [1] The Nelson-Oppen decision procedure: [2]

There's a subset of arithmetic and logic that's completely decidable. It includes addition, subtraction, multiplication by constants, "if", the theory of arrays, a similar theory of structures, and the Boolean operators (on bools, not integers.) This is enough for most subscript and overflow checks. Complete solvers exist for this subset. If you try to prove all constraints that need run-time checks with this approach, you can eliminate maybe 80-90% of run time checks as unnecessary.

Some checks cannot be eliminated, but can be hoisted out of loops. This means making one check before entering the loop, rather than one on each iteration. FOR statements usually can be optimized in this way. That's almost a no-brainer in modern compilers. The "Go" compiler does this. Heavier machinery is needed for more complex loops.

[1] http://www.animats.com/papers/verifier/verifiermanual.pdf

[2] http://www.cs.cmu.edu/~joshuad/NelsonOppen.pdf "

links 2 abstract:

A method for combining decision procedures for several theories into a single decision procedure for their combination is described, and a simplifier based on this method is discussed. The simplifier finds a normal form for any expression formed from individual variables, the usual Boolean connectives, the equality predicate =, the conditional function if-then-else, the integers, the arithmetic functions and predicates +, -, and ≤, the Lisp functions and predicates car, cdr, cons, and atom, the functions store and select for storing into and selecting from arrays, and uninterpreted function symbols. If the expression is a theorem it is simplified to the constant true, so the simplifier can be used as a decision procedure for the quantifier-free theory containing these functions and predicates. The simplifier is currently used in the Stanford Pascal Verifier. Key Words and Phrases: program verification, program manipulation, theorem proving, decidability, simplification
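One ingredient of the equality-with-uninterpreted-function-symbols theory the abstract mentions is congruence closure; a minimal union-find sketch of the core idea (not the paper's actual algorithm):

```python
# Minimal congruence-closure sketch over ground terms.
# Terms are strings (constants) or tuples like ("f", arg, ...).
parent = {}

def find(t):
    # union-find with path halving
    parent.setdefault(t, t)
    while parent[t] != t:
        parent[t] = parent[parent[t]]
        t = parent[t]
    return t

def union(a, b):
    ra, rb = find(a), find(b)
    if ra != rb:
        parent[ra] = rb

def congruent(s, t):
    # f(a1..an) ~ f(b1..bn) if each ai ~ bi
    return (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)
            and all(find(a) == find(b) for a, b in zip(s[1:], t[1:])))

def close(terms):
    # propagate congruences to a fixed point
    changed = True
    while changed:
        changed = False
        for s in terms:
            for t in terms:
                if find(s) != find(t) and congruent(s, t):
                    union(s, t)
                    changed = True

# Example: from a = b, deduce f(a) = f(b).
fa, fb = ("f", "a"), ("f", "b")
union("a", "b")
close([fa, fb])
print(find(fa) == find(fb))  # True
```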

---

great looking talk (already added to toreads):

---

saw someone else praising miniKanren again. He said it was more powerful than Prolog. Gotta learn it sometime.

---

ppl are talking about SMT solvers (how do they do vs SAT solvers? are they an elaboration of SAT solvers?)

---

skimmed a little bit more of the Why3 manual http://why3.lri.fr/download/manual-0.85.pdf Looks pretty cool, the kind of stuff we want.

Figure 2.2 in manual version 0.85, where Order, SortedGen, and SortedIntList are defined, is interesting:

theory Order
  type t
  predicate (<=) t t
  axiom Le_refl : forall x : t. x <= x
  axiom Le_asym : forall x y : t. x <= y -> y <= x -> x = y
  axiom Le_trans: forall x y z : t. x <= y -> y <= z -> x <= z
end

theory SortedGen
  use import List
  clone import Order as O
  inductive sorted (l : list t) =
    | Sorted_Nil :
      sorted Nil
    | Sorted_One :
      forall x:t. sorted (Cons x Nil)
    | Sorted_Two :
      forall x y : t, l : list t.
      x <= y -> sorted (Cons y l) -> sorted (Cons x (Cons y l))
end

theory SortedIntList
  use import int.Int
  clone SortedGen with type O.t = int, predicate O.(<=) = (<=)
end

It contains examples of:

This appears to be similar to parametric polymorphism via templates (i think we could unify those things in Oot and have a slightly simpler syntax). Note that in the figure, "(<=)" refers to an arbitrary operator, a template variable that gets replaced upon cloning, not necessarily the actual <=.

Other notes; nearby manual text says:

---

http://logitext.mit.edu/main

---

staticassertion 3 hours ago [-]

I wouldn't use SEL4 as an argument. It is completely unreasonable to build a production kernel the way they did - it took them ages to get what they had. Google is probably looking to develop features/ move code a lot faster than is possible using their technique.

Good coding practices take no time to implement, you learn it once and you write that way. It is not bullet proof, neither are capabilities. Still, at least they're practical.

reply

nickpsecurity 2 hours ago [-]

How about Muen separation kernel in SPARK Ada with automated prover eliminating many classes of undefined behavior? Or C with tools like Frama-C and Astree Analyzer (like SPARK for C)? C++ can't reach the safety of its competition due to more complexity and less tooling.

reply

---

random discussion about prolog

https://news.ycombinator.com/item?id=14045987

---

example of using logic-programming to solve the n-queens problem. Embedded into type-level programming in (a GHC-flags-enabled variant of) Haskell:

https://aphyr.com/posts/342-typing-the-technical-interview

starts by defining peano arithmetic, basic boolean functions, map and filter, cons-based lists, any, listconcat, apply (partial function application), conj (conj list x = cons x list).
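For contrast, the same search at the value level rather than the type level; a plain backtracking sketch (my own formulation, not the post's Haskell encoding):

```python
# Backtracking n-queens: place one queen per row, tracking the column chosen
# for each earlier row; a placement is safe if no earlier queen shares its
# column or a diagonal.
def queens(n):
    def safe(placed, col):
        row = len(placed)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(placed))

    def extend(placed):
        if len(placed) == n:
            yield tuple(placed)
            return
        for col in range(n):
            if safe(placed, col):
                yield from extend(placed + [col])

    return list(extend([]))

print(len(queens(6)))  # 4 solutions for n=6
```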

---

http://www.cprover.org/cbmc/

reply

jepler 1 day ago [-]

cbmc doesn't execute C(++) programs, but rather it turns them into instances of the SAT problem, which it then passes off to a SAT solver to (hopefully) prove it. It's assert() (and assume() and nondet_xxx(), which are not mentioned in the article) that have special powers in cbmc, but functions named prove_xxx() aren't special as far as I'm aware.

reply

---

copied to here from my pl book:

---

zmonx 6 hours ago [-]

I have two comments about the Prolog code:

First, the article claims: "the sudoku solver above does a brute force search", but that is specifically not the case. In fact, quite the opposite holds: The Prolog Sudoku formulation shown in the article uses a powerful technique known as Constraint Logic Programming over Finite Domains, which automatically performs pruning before and also during the search for solutions. This effectively eliminates large portions of the search space in practice, and degenerates to brute force only in those cases where almost nothing can be deduced from initial constraints. In the particular case of Sudoku, the pruning is especially effective and can in many cases eliminate the search entirely, since the initial constraints (hints) basically determine the whole solution.
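A toy version of the pruning being described (real CLP(FD) propagates far more than this): repeatedly delete assigned values from the domains of peer cells until a fixed point; on easy puzzles this alone can finish the grid without any search.

```python
# Toy constraint propagation: each cell has a domain of candidate values;
# whenever a cell's domain becomes a singleton, remove that value from all
# its peers, and repeat until nothing changes.
def propagate(domains, peers):
    changed = True
    while changed:
        changed = False
        for cell, dom in domains.items():
            if len(dom) == 1:
                v = next(iter(dom))
                for p in peers[cell]:
                    if v in domains[p]:
                        domains[p].discard(v)
                        changed = True
    return domains

# Tiny example: four mutually-constrained cells (one "row"); three are fixed,
# so propagation alone deduces the fourth with no search.
peers = {c: {d for d in range(4) if d != c} for c in range(4)}
domains = {0: {1}, 1: {2}, 2: {3}, 3: {1, 2, 3, 4}}
propagate(domains, peers)
print(domains[3])  # {4}
```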

Second, yes, it is easy to write an O(n!) search algorithm in Prolog. However, it is almost as easy to implement much more efficient search algorithms in Prolog. For example, here is Quicksort in Prolog:

    quicksort([])     --> [].
    quicksort([L|Ls]) -->
            { partition(Ls, L, Smalls, Bigs) },
            quicksort(Smalls), [L], quicksort(Bigs).

Note how natural the declarative description of "first the smaller elements, then the pivot element, then the bigger elements" is in Prolog. This only requires a suitable implementation of partition/4, which is very easy to implement in at most 7 lines of Prolog code.

reply
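The same declarative reading ("first the smaller elements, then the pivot element, then the bigger elements") transcribed into Python; like the Prolog version, this copies lists rather than sorting in place (which is the overhead chubot complains about below):

```python
# Declarative-style quicksort: partition the tail around the head, then
# recursively sort the two partitions and concatenate.
def quicksort(xs):
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    smalls = [x for x in rest if x <= pivot]
    bigs   = [x for x in rest if x > pivot]
    return quicksort(smalls) + [pivot] + quicksort(bigs)

print(quicksort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```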

tomxor 8 hours ago [-]

Pruning is an optimisation though, not a computational reduction... the remaining computation is still brute force. While it may be effective with low n, the complexity will still grow in roughly the same way as n increases - for this reason I would still call it brute force.

Isn't this why sudoku is still considered NP hard? (unless there has been a recent breakthrough).

reply

chubot 3 hours ago [-]

Eh, I'm pretty sure that (((Prolog quicksort example code))) suffers from the problem noted here:

https://news.ycombinator.com/item?id=13375108

(((the problem is that non-in-place quicksort has worse time complexity due to copying and garbage collection)))

See the "genuine sieve of eratosthenes" paper, and also:

http://augustss.blogspot.com/2007/08/quicksort-in-haskell-qu...

reply

sriku 2 hours ago [-]

FD constraint programming is powerful. Mozart/Oz goes one step further by making the search strategies programmable within the language using the idea of computational spaces.

I think two of the OP's notes are covered in Mozart/Oz - concurrent programming using dataflow variables (roughly equivalent to promises) and logic programming, including finite domain constraints.

reply

 zmonx 2 hours ago [-]

I agree, with only one small additional remark: In general, a programmable search strategy sacrifices termination guarantees for more generality. That's a completely defensible approach of course, and not an argument against this path of development! Still, it comes at a price, and researchers working in termination analysis may prefer precisely the tighter reins and more stringent resulting properties we obtain when we eschew programmable strategies in specific parts of the language.

For finite domain constraints, such guarantees are a major attraction in the Prolog community: If setting up the constraints terminates, then setting up the constraints plus search also terminates. That's a rather valuable and strong guarantee (since it is the search that can take infeasibly long more often than setting up the constraints), and it breaks down if the search itself is more generally programmable.

reply

---

" Pre- and Post-conditions

Clojure does allow you to define pre- and post-condition functions for assertions on inputs and output, but they don’t seem very common in practice and they only work at runtime.

(defn product-str [& nums]
  {:pre  [(every? number? nums)]  ;; every input must be a number
   :post [(string? %)]}           ;; output must be a string
  (str (apply * nums)))

(product-str 1 1/2 3.0 4 5)
> "30.0"

(product-str 1 2 "3")
> throws "CompilerException java.lang.AssertionError:
  Assert failed: (every? number? nums), compiling ..."

I suspect clojure.spec’s function instrumentation would mostly supersede pre- and post-conditions, but I’m just guessing. It’s possible to use them together. " [3]

---

https://www.imperialviolet.org/2014/09/07/provers.html

---

nickpsecurity 59 days ago [-]

I like this article. I want to correct something on C and C++ side.

He hints it's barely possible to write correct C. He doesn't mention how the safety-critical sector, esp regulated, cranks out reliable C on a regular basis. They use C subsets with careful coding, reviews, and every automated tool they can. Static analyzers, automated testing, you name it. There are also analyzers that, when they work, can prove absence of entire classes of errors. RV-Match, Astree Analyzer (used by Airbus), and Frama-C are examples. CompCert eliminates compiler errors afterward. Most security-critical development doesn't use tools like these even when companies can afford them. It's apathy, not technological capability.

Now, C++ is special given it was designed to reduce risks of using C. Yet, almost all the verification tooling was for C and Java. C++ is just so complicated that it's hard to statically analyze or even build tools for. That means a C program done like above might have fewer errors than a C++ program. For that reason, I recommend avoiding C++ in favor of either C or safe systems languages compiling to C to take advantage of its verification tooling. They're getting hit with these vulnerabilities because they're using a harder-to-analyze language, possibly with a coding strategy that adds difficulty. See Cleanroom Software Engineering or Praxis's Correct-by-Construction for examples of low-defect development strategy.

---

https://hackernoon.com/improve-smart-contract-security-by-formal-proofs-c0b377288e2e

---

https://ucsd-progsys.github.io/liquidhaskell-blog/

---


might be interesting, just as a demo of using a language-integrated theorem prover to prove the correctness of a program https://blog.adacore.com/i-cant-believe-that-i-can-prove-that-it-can-sort

---

some interesting footnotes: " Like F*, Dafny, Why3, and Chalice. Each takes a different and interesting approach to bridging the verification gap.

The only marginally used language that I know about with a full formal semantics is Standard ML. Worth mentioning, WebAssembly does too! If others are out there, I’d love to know about them. " -- [4]

---

languages defined by term-rewriting

e.g. define a derivative operator "derivative" and then just define "derivative(x^2, x) = 2*x"
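A tiny sketch of that idea (terms as nested tuples; the rule set here is invented for illustration and covers only constants, variables, +, *, and integer powers):

```python
# Symbolic differentiation as term rewriting.
# Terms: numbers, variable names (strings), or (op, a, b) tuples.
def d(term, x):
    if isinstance(term, (int, float)):
        return 0                              # derivative(c, x) = 0
    if isinstance(term, str):
        return 1 if term == x else 0          # derivative(x, x) = 1
    op, a, b = term
    if op == '+':                             # sum rule
        return ('+', d(a, x), d(b, x))
    if op == '*':                             # product rule
        return ('+', ('*', d(a, x), b), ('*', a, d(b, x)))
    if op == '^' and isinstance(b, int):      # power rule (with chain rule)
        return ('*', b, ('*', ('^', a, b - 1), d(a, x)))
    raise ValueError(op)

# derivative(x^2, x) = 2 * (x^1 * 1), i.e. 2*x before simplification
print(d(('^', 'x', 2), 'x'))
```

A real term-rewriting language would also want simplification rules (x^1 → x, t*1 → t) applied to a fixed point.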

---

https://alloytools.org/alloy6.html

---

"agentultra" seems to capture what seems to me to be the consensus here as of 2022 (formal methods are getting easier but are not easy enuf yet): https://news.ycombinator.com/item?id=31545290

i should email him and ask what's the latest/what i should learn first. Here's some stuff i heard of, not sure if these are even the same type of thing at all:

TLA+, Idris, F*, Dafny, Why3, Lean, Mercury, Alloy, Agda, Coq

if these are different types of things, one or more recommendations in each category are welcome. I can imagine there might be at least three categories in there, though i'm not sure: formal methods for proving correctness of programs, formal methods for writing down and checking general mathematical proofs, languages for expressing computation as logic programming.

my goal is just to learn about what's out there, both in terms of what's the most useful tool for each purpose, and in terms of what's the most "clean" approach (even if existing implementations/libraries are lacking).

---

the upshot of this is that requires/ensures are the crucial things to support: [5]

---

Corbin 6 hours ago

Two other options for embedding logic programming are miniKanren and CHR. These embeddings have very different behavior; miniKanren is shaped like the author’s desired system, but CHR is more like a tuple space or embedded database which can be queried and updated alongside imperative actions.
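The heart of a miniKanren-style embedding is unification over terms containing logic variables; a minimal sketch (variables are strings starting with '?'; a real miniKanren adds goals, streams, and an occurs check):

```python
# Minimal unification over nested-tuple terms with '?'-prefixed logic
# variables; substitutions are plain dicts, None means failure.
def walk(t, s):
    # follow variable bindings in substitution s
    while isinstance(t, str) and t.startswith('?') and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, str) and a.startswith('?'):
        return {**s, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

# Unifying (cons ?x ?y) with (cons 1 (cons 2 ())) binds ?x and ?y.
s = unify(('cons', '?x', '?y'), ('cons', 1, ('cons', 2, ())), {})
print(s)  # {'?x': 1, '?y': ('cons', 2, ())}
```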

dkl 2 hours ago

CHR is a really interesting system and I wish there were more literature or examples of usage. I used it once to complete an interesting “missing data” puzzle, and I started a Sudoku solver with it. It seems to be a nice fit when you have a morass of data which you want to sort of massage into information until things coalesce and you find a solution.

http://minikanren.org/ https://en.wikipedia.org/wiki/Constraint_Handling_Rules

---

"I had never heard of the assume keyword before this. It is a mechanism by which the C++ programmer can indicate that an arbitrarily complex expression always evaluates to true." "The __builtin_unreachable extension: An C++23 alternative" -- https://developers.redhat.com/articles/2023/02/02/support-c23-assume-feature-gcc#how_ranger_improves_optimization

---