swish/commit

New files from upstream

author     Jan Wielemaker    Thu May 3 12:02:49 2018 +0200
committer  Jan Wielemaker    Thu May 3 12:04:06 2018 +0200
commit     2aec03dcccddb259fed1dd84ac8915ca6ad2057d
tree       4aee4d4e27488bc9b26b0cde30d78e75e58ef91e
parent     3201945dc41b4a001dd0874d9b7adbf1d7df4f2a
diff --git a/client/sin-table.html b/client/sin-table.html
index a6d2d39..a9e3b86 100644
--- a/client/sin-table.html
+++ b/client/sin-table.html
@@ -2,11 +2,12 @@
 <html>
   <head>
+    <meta charset="UTF-8">
     <title>Create sine table using Pengines</title>
-    <script src="http://swish.swi-prolog.org/pengine/pengines.js"
+    <script src="https://code.jquery.com/jquery-2.1.3.min.js"
            type="text/javascript">
     </script>
-    <script src="https://code.jquery.com/jquery-2.1.3.min.js"
+    <script src="https://swish.swi-prolog.org/pengine/pengines.js"
            type="text/javascript">
     </script>
   </head>
@@ -17,9 +18,9 @@
 <p>
 The table below is filled by running a query against the main SWISH
 server at <a
-href="http://swish.swi-prolog.org">http://swish.swi-prolog.org</a>,
+href="https://swish.swi-prolog.org">https://swish.swi-prolog.org</a>,
 using the saved script <a
-href="http://swish.swi-prolog.org/p/sin_table.pl">sin_table.pl</a>. Note that this
+href="https://swish.swi-prolog.org/p/sin_table.pl">sin_table.pl</a>. Note that this
 example illustrates that you can write interactive web applications
 against one or more Pengine enabled Prolog servers without having
 direct access to Prolog.
@@ -41,7 +42,7 @@ and ask for more results if there are more available.
 */
 $(function() {
-  new Pengine({ server: "http://swish.swi-prolog.org/pengine",
+  new Pengine({ server: "https://swish.swi-prolog.org/pengine",
                 ask: "sin_table(X,Y)",
                 chunk: 1000,
                 application: "swish",
diff --git a/examples/swish_tutorials.swinb b/examples/swish_tutorials.swinb
index 1ca0b81..185a4d1 100644
--- a/examples/swish_tutorials.swinb
+++ b/examples/swish_tutorials.swinb
@@ -1,16 +1,17 @@
 <div class="notebook">

-<div class="nb-cell markdown">
+<div class="nb-cell markdown" name="md1">
 # SWISH Tutorials

 This notebook provides an overview of tutorials about using SWISH.
 - [Rendering answers graphically](example/rendering.swinb)
 - [Using HTML cells in notebooks](example/htmlcell.swinb)
+ - [Accessing external data](example/data_source.swinb)
 - [Access the SWISH interface from Prolog](example/jquery.swinb)
 </div>

-<div class="nb-cell markdown">
+<div class="nb-cell markdown" name="md2">
 ## Embedded R support

 The [R project](https://www.r-project.org/) provides statistical computing and data vizualization. SWISH can access R through [Rserve](https://rforge.net/Rserve/). The *prototype* client for Rserve is available as the _pack_ [rserve_client](http://www.swi-prolog.org/pack/list?p=rserve_client). The GitHub repository [rserve-sandbox](https://github.com/JanWielemaker/rserve-sandbox) provides the matching Rserve server as a [Docker](https://www.docker.com/) specification.
@@ -22,11 +23,11 @@ The notebooks below explain the basics of using R from SWISH. You can test whet
 - [Downloading (graphics) files](example/Rdownload.swinb)
 </div>

-<div class="nb-cell query">
+<div class="nb-cell query" name="q1">
 <- 'R.Version'().
 </div>

-<div class="nb-cell markdown">
+<div class="nb-cell markdown" name="md3">
 ### More elaborate examples

 - [EM Clustering of the Iris Dataset](example/iris.swinb)
diff --git a/lib/swish/authenticate.pl b/lib/swish/authenticate.pl
index a70c800..271318a 100644
--- a/lib/swish/authenticate.pl
+++ b/lib/swish/authenticate.pl
@@ -144,6 +144,7 @@ current_user_property(identity(_Atom), dict).
 current_user_property(external_identity(_String), dict).
 current_user_property(identity_provider(_Atom), dict).
 current_user_property(profile_id(_Atom), dict).
+current_user_property(avatar(_String), dict).
 current_user_property(login(_IdProvider), derived).
 current_user_property(name(_Name), broadcast).
diff --git a/lib/swish/avatar.pl b/lib/swish/avatar.pl
index cacfba7..b531913 100644
--- a/lib/swish/avatar.pl
+++ b/lib/swish/avatar.pl
@@ -64,7 +64,7 @@ email_gravatar(Email, AvatarURL) :-
     downcase_atom(Email, CanonicalEmail),
     md5_hash(CanonicalEmail, Hash, []),
     atom_concat('/avatar/', Hash, Path),
-    uri_data(scheme, Components, http),
+    uri_data(scheme, Components, https),
     uri_data(authority, Components, 'www.gravatar.com'),
     uri_data(path, Components, Path),
     uri_components(AvatarURL, Components).
@@ -72,15 +72,32 @@ email_gravatar(Email, AvatarURL) :-
 %%  valid_gravatar(+URL) is semidet.
 %
-%   True if URL is a real gavatar.
+%   True if URL is a real gravatar. We cache results for 300
+%   seconds.
+
+:- dynamic
+    gravatar_tested/3.                  % URL, Time, Result
+
+valid_gravatar(URL) :-
+    gravatar_tested(URL, Time, Result),
+    get_time(Now),
+    (   Now - Time < 300
+    ->  !,
+        Result == true
+    ;   retractall(gravatar_tested(URL,_,_))
+    ).
 valid_gravatar(URL) :-
     string_concat(URL, "?d=404", URL2),
-    catch(http_open(URL2, In, [method(head)]),
-          error(existence_error(_,_),_),
-          fail),
-    close(In).
-
+    (   catch(http_open(URL2, In, [method(head)]),
+              error(_,_),
+              fail)
+    ->  close(In),
+        Result = true
+    ;   Result = false
+    ),
+    get_time(Now),
+    asserta(gravatar_tested(URL, Now, Result)),
+    Result == true.

 %%  random_avatar(-AvatarURL) is det.
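The patched `email_gravatar/2` and `valid_gravatar/1` above boil down to: MD5-hash the lowercased address, build an `https://www.gravatar.com/avatar/<hash>` URL, probe it with `?d=404` (Gravatar returns 404 for unknown hashes when that parameter is given), and cache the probe result for 300 seconds. A minimal Python sketch of the same logic follows; the `probe` callback stands in for `http_open/3` with `method(head)` and is a hypothetical parameter, and the leading/trailing-whitespace trim follows Gravatar's documented canonicalization (the Prolog code only downcases).

```python
import hashlib
import time

def gravatar_url(email):
    """Mirror email_gravatar/2: MD5 of the canonical (trimmed,
    lowercased) address under https://www.gravatar.com/avatar/."""
    canonical = email.strip().lower()
    digest = hashlib.md5(canonical.encode("utf-8")).hexdigest()
    return "https://www.gravatar.com/avatar/" + digest

# Cache of probe results, mirroring the gravatar_tested/3 facts:
# URL -> (timestamp, is_valid).  Entries older than 300s are re-probed.
_tested = {}
_TTL = 300

def valid_gravatar(url, probe):
    """probe(url) must perform the HEAD request against url + '?d=404'
    and return True/False (caller-supplied, hypothetical)."""
    entry = _tested.get(url)
    now = time.time()
    if entry is not None and now - entry[0] < _TTL:
        return entry[1]                 # served from the 300s cache
    result = probe(url + "?d=404")
    _tested[url] = (now, result)        # refresh the cache entry
    return result
```

Unlike the Prolog version, a stale entry is simply overwritten rather than retracted first; the observable behaviour is the same.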
% diff --git a/lib/swish/bad-words-google.txt b/lib/swish/bad-words-google.txt new file mode 100644 index 0000000..b2ea5d4 --- /dev/null +++ b/lib/swish/bad-words-google.txt @@ -0,0 +1,550 @@ +4r5e +50 yard cunt punt +5h1t +5hit +a_s_s +a2m +a55 +adult +amateur +anal +anal impaler +anal leakage +anilingus +anus +ar5e +arrse +arse +arsehole +ass +ass fuck +asses +assfucker +ass-fucker +assfukka +asshole +asshole +assholes +assmucus +assmunch +asswhole +autoerotic +b!tch +b00bs +b17ch +b1tch +ballbag +ballsack +bang (one's) box +bangbros +bareback +bastard +beastial +beastiality +beef curtain +bellend +bestial +bestiality +bi+ch +biatch +bimbos +birdlock +bitch +bitch tit +bitcher +bitchers +bitches +bitchin +bitching +bloody +blow job +blow me +blow mud +blowjob +blowjobs +blue waffle +blumpkin +boiolas +bollock +bollok +boner +boob +boobs +booobs +boooobs +booooobs +booooooobs +breasts +buceta +bugger +bum +bunny fucker +bust a load +busty +butt +butt fuck +butthole +buttmuch +buttplug +c0ck +c0cksucker +carpet muncher +carpetmuncher +cawk +chink +choade +chota bags +cipa +cl1t +clit +clit licker +clitoris +clits +clitty litter +clusterfuck +cnut +cock +cock pocket +cock snot +cockface +cockhead +cockmunch +cockmuncher +cocks +cocksuck +cocksucked +cocksucker +cock-sucker +cocksucking +cocksucks +cocksuka +cocksukka +cok +cokmuncher +coksucka +coon +cop some wood +cornhole +corp whore +cox +cum +cum chugger +cum dumpster +cum freak +cum guzzler +cumdump +cummer +cumming +cums +cumshot +cunilingus +cunillingus +cunnilingus +cunt +cunt hair +cuntbag +cuntlick +cuntlicker +cuntlicking +cunts +cuntsicle +cunt-struck +cut rope +cyalis +cyberfuc +cyberfuck +cyberfucked +cyberfucker +cyberfuckers +cyberfucking +d1ck +damn +dick +dick hole +dick shy +dickhead +dildo +dildos +dink +dinks +dirsa +dirty Sanchez +dlck +dog-fucker +doggie style +doggiestyle +doggin +dogging +donkeyribber +doosh +duche +dyke +eat a dick +eat hair pie +ejaculate +ejaculated +ejaculates 
+ejaculating +ejaculatings +ejaculation +ejakulate +erotic +f u c k +f u c k e r +f_u_c_k +f4nny +facial +fag +fagging +faggitt +faggot +faggs +fagot +fagots +fags +fanny +fannyflaps +fannyfucker +fanyy +fatass +fcuk +fcuker +fcuking +feck +fecker +felching +fellate +fellatio +fingerfuck +fingerfucked +fingerfucker +fingerfuckers +fingerfucking +fingerfucks +fist fuck +fistfuck +fistfucked +fistfucker +fistfuckers +fistfucking +fistfuckings +fistfucks +flange +flog the log +fook +fooker +fuck hole +fuck puppet +fuck trophy +fuck yo mama +fuck +fucka +fuck-ass +fuck-bitch +fucked +fucker +fuckers +fuckhead +fuckheads +fuckin +fucking +fuckings +fuckingshitmotherfucker +fuckme +fuckmeat +fucks +fucktoy +fuckwhit +fuckwit +fudge packer +fudgepacker +fuk +fuker +fukker +fukkin +fuks +fukwhit +fukwit +fux +fux0r +gangbang +gangbang +gang-bang +gangbanged +gangbangs +gassy ass +gaylord +gaysex +goatse +god +god damn +god-dam +goddamn +goddamned +god-damned +ham flap +hardcoresex +hell +heshe +hoar +hoare +hoer +homo +homoerotic +hore +horniest +horny +hotsex +how to kill +how to murdep +jackoff +jack-off +jap +jerk +jerk-off +jism +jiz +jizm +jizz +kawk +kinky Jesus +knob +knob end +knobead +knobed +knobend +knobend +knobhead +knobjocky +knobjokey +kock +kondum +kondums +kum +kummer +kumming +kums +kunilingus +kwif +l3i+ch +l3itch +labia +LEN +lmao +lmfao +lmfao +lust +lusting +m0f0 +m0fo +m45terbate +ma5terb8 +ma5terbate +mafugly +masochist +masterb8 +masterbat* +masterbat3 +masterbate +master-bate +masterbation +masterbations +masturbate +mof0 +mofo +mo-fo +mothafuck +mothafucka +mothafuckas +mothafuckaz +mothafucked +mothafucker +mothafuckers +mothafuckin +mothafucking +mothafuckings +mothafucks +mother fucker +mother fucker +motherfuck +motherfucked +motherfucker +motherfuckers +motherfuckin +motherfucking +motherfuckings +motherfuckka +motherfucks +muff +muff puff +mutha +muthafecker +muthafuckker +muther +mutherfucker +n1gga +n1gger +nazi +need the dick +nigg3r 
+nigg4h +nigga +niggah +niggas +niggaz +nigger +niggers +nob +nob jokey +nobhead +nobjocky +nobjokey +numbnuts +nut butter +nutsack +omg +orgasim +orgasims +orgasm +orgasms +p0rn +pawn +pecker +penis +penisfucker +phonesex +phuck +phuk +phuked +phuking +phukked +phukking +phuks +phuq +pigfucker +pimpis +piss +pissed +pisser +pissers +pisses +pissflaps +pissin +pissing +pissoff +poop +porn +porno +pornography +pornos +prick +pricks +pron +pube +pusse +pussi +pussies +pussy +pussy fart +pussy palace +pussys +queaf +queer +rectum +retard +rimjaw +rimming +s hit +s.o.b. +s_h_i_t +sadism +sadist +sandbar +sausage queen +schlong +screwing +scroat +scrote +scrotum +semen +sex +sh!+ +sh!t +sh1t +shag +shagger +shaggin +shagging +shemale +shi+ +shit +shit fucker +shitdick +shite +shited +shitey +shitfuck +shitfull +shithead +shiting +shitings +shits +shitted +shitter +shitters +shitting +shittings +shitty +skank +slope +slut +slut bucket +sluts +smegma +smut +snatch +son-of-a-bitch +spac +spunk +t1tt1e5 +t1tties +teets +teez +testical +testicle +tit +tit wank +titfuck +tits +titt +tittie5 +tittiefucker +titties +tittyfuck +tittywank +titwank +tosser +turd +tw4t +twat +twathead +twatty +twunt +twunter +v14gra +v1gra +vagina +viagra +vulva +w00se +wang +wank +wanker +wanky +whoar +whore +willies +willy +wtf +xrated +xxx \ No newline at end of file diff --git a/lib/swish/bad-words.txt b/lib/swish/bad-words.txt new file mode 100644 index 0000000..5013fea --- /dev/null +++ b/lib/swish/bad-words.txt @@ -0,0 +1,1384 @@ + +abbo +abo +abortion +abuse +addict +addicts +adult +africa +african +alla +allah +alligatorbait +amateur +american +anal +analannie +analsex +angie +angry +anus +arab +arabs +areola +argie +aroused +arse +arsehole +asian +ass +assassin +assassinate +assassination +assault +assbagger +assblaster +assclown +asscowboy +asses +assfuck +assfucker +asshat +asshole +assholes +asshore +assjockey +asskiss +asskisser +assklown +asslick +asslicker +asslover +assman 
+assmonkey +assmunch +assmuncher +asspacker +asspirate +asspuppies +assranger +asswhore +asswipe +athletesfoot +attack +australian +babe +babies +backdoor +backdoorman +backseat +badfuck +balllicker +balls +ballsack +banging +baptist +barelylegal +barf +barface +barfface +bast +bastard +bazongas +bazooms +beaner +beast +beastality +beastial +beastiality +beatoff +beat-off +beatyourmeat +beaver +bestial +bestiality +bi +biatch +bible +bicurious +bigass +bigbastard +bigbutt +bigger +bisexual +bi-sexual +bitch +bitcher +bitches +bitchez +bitchin +bitching +bitchslap +bitchy +biteme +black +blackman +blackout +blacks +blind +blow +blowjob +boang +bogan +bohunk +bollick +bollock +bomb +bombers +bombing +bombs +bomd +bondage +boner +bong +boob +boobies +boobs +booby +boody +boom +boong +boonga +boonie +booty +bootycall +bountybar +bra +brea5t +breast +breastjob +breastlover +breastman +brothel +bugger +buggered +buggery +bullcrap +bulldike +bulldyke +bullshit +bumblefuck +bumfuck +bunga +bunghole +buried +burn +butchbabes +butchdike +butchdyke +butt +buttbang +butt-bang +buttface +buttfuck +butt-fuck +buttfucker +butt-fucker +buttfuckers +butt-fuckers +butthead +buttman +buttmunch +buttmuncher +buttpirate +buttplug +buttstain +byatch +cacker +cameljockey +cameltoe +canadian +cancer +carpetmuncher +carruth +catholic +catholics +cemetery +chav +cherrypopper +chickslick +children's +chin +chinaman +chinamen +chinese +chink +chinky +choad +chode +christ +christian +church +cigarette +cigs +clamdigger +clamdiver +clit +clitoris +clogwog +cocaine +cock +cockblock +cockblocker +cockcowboy +cockfight +cockhead +cockknob +cocklicker +cocklover +cocknob +cockqueen +cockrider +cocksman +cocksmith +cocksmoker +cocksucer +cocksuck +cocksucked +cocksucker +cocksucking +cocktail +cocktease +cocky +cohee +coitus +color +colored +coloured +commie +communist +condom +conservative +conspiracy +coolie +cooly +coon +coondog +copulate +cornhole +corruption +cra5h +crabs +crack +crackpipe 
+crackwhore +crack-whore +crap +crapola +crapper +crappy +crash +creamy +crime +crimes +criminal +criminals +crotch +crotchjockey +crotchmonkey +crotchrot +cum +cumbubble +cumfest +cumjockey +cumm +cummer +cumming +cumquat +cumqueen +cumshot +cunilingus +cunillingus +cunn +cunnilingus +cunntt +cunt +cunteyed +cuntfuck +cuntfucker +cuntlick +cuntlicker +cuntlicking +cuntsucker +cybersex +cyberslimer +dago +dahmer +dammit +damn +damnation +damnit +darkie +darky +datnigga +dead +deapthroat +death +deepthroat +defecate +dego +demon +deposit +desire +destroy +deth +devil +devilworshipper +dick +dickbrain +dickforbrains +dickhead +dickless +dicklick +dicklicker +dickman +dickwad +dickweed +diddle +die +died +dies +dike +dildo +dingleberry +dink +dipshit +dipstick +dirty +disease +diseases +disturbed +dive +dix +dixiedike +dixiedyke +doggiestyle +doggystyle +dong +doodoo +doo-doo +doom +dope +dragqueen +dragqween +dripdick +drug +drunk +drunken +dumb +dumbass +dumbbitch +dumbfuck +dyefly +dyke +easyslut +eatballs +eatme +eatpussy +ecstacy +ejaculate +ejaculated +ejaculating +ejaculation +enema +enemy +erect +erection +ero +escort +ethiopian +ethnic +european +evl +excrement +execute +executed +execution +executioner +explosion +facefucker +faeces +fag +fagging +faggot +fagot +failed +failure +fairies +fairy +faith +fannyfucker +fart +farted +farting +farty +fastfuck +fat +fatah +fatass +fatfuck +fatfucker +fatso +fckcum +fear +feces +felatio +felch +felcher +felching +fellatio +feltch +feltcher +feltching +fetish +fight +filipina +filipino +fingerfood +fingerfuck +fingerfucked +fingerfucker +fingerfuckers +fingerfucking +fire +firing +fister +fistfuck +fistfucked +fistfucker +fistfucking +fisting +flange +flasher +flatulence +floo +flydie +flydye +fok +fondle +footaction +footfuck +footfucker +footlicker +footstar +fore +foreskin +forni +fornicate +foursome +fourtwenty +fraud +freakfuck +freakyfucker +freefuck +fu +fubar +fuc +fucck +fuck +fucka +fuckable +fuckbag 
+fuckbuddy +fucked +fuckedup +fucker +fuckers +fuckface +fuckfest +fuckfreak +fuckfriend +fuckhead +fuckher +fuckin +fuckina +fucking +fuckingbitch +fuckinnuts +fuckinright +fuckit +fuckknob +fuckme +fuckmehard +fuckmonkey +fuckoff +fuckpig +fucks +fucktard +fuckwhore +fuckyou +fudgepacker +fugly +fuk +fuks +funeral +funfuck +fungus +fuuck +gangbang +gangbanged +gangbanger +gangsta +gatorbait +gay +gaymuthafuckinwhore +gaysex +geez +geezer +geni +genital +german +getiton +gin +ginzo +gipp +girls +givehead +glazeddonut +gob +god +godammit +goddamit +goddammit +goddamn +goddamned +goddamnes +goddamnit +goddamnmuthafucker +goldenshower +gonorrehea +gonzagas +gook +gotohell +goy +goyim +greaseball +gringo +groe +gross +grostulation +gubba +gummer +gun +gyp +gypo +gypp +gyppie +gyppo +gyppy +hamas +handjob +hapa +harder +hardon +harem +headfuck +headlights +hebe +heeb +hell +henhouse +heroin +herpes +heterosexual +hijack +hijacker +hijacking +hillbillies +hindoo +hiscock +hitler +hitlerism +hitlerist +hiv +ho +hobo +hodgie +hoes +hole +holestuffer +homicide +homo +homobangers +homosexual +honger +honk +honkers +honkey +honky +hook +hooker +hookers +hooters +hore +hork +horn +horney +horniest +horny +horseshit +hosejob +hoser +hostage +hotdamn +hotpussy +hottotrot +hummer +husky +hussy +hustler +hymen +hymie +iblowu +idiot +ikey +illegal +incest +insest +intercourse +interracial +intheass +inthebuff +israel +israeli +israel's +italiano +itch +jackass +jackoff +jackshit +jacktheripper +jade +jap +japanese +japcrap +jebus +jeez +jerkoff +jesus +jesuschrist +jew +jewish +jiga +jigaboo +jigg +jigga +jiggabo +jigger +jiggy +jihad +jijjiboo +jimfish +jism +jiz +jizim +jizjuice +jizm +jizz +jizzim +jizzum +joint +juggalo +jugs +junglebunny +kaffer +kaffir +kaffre +kafir +kanake +kid +kigger +kike +kill +killed +killer +killing +kills +kink +kinky +kissass +kkk +knife +knockers +kock +kondum +koon +kotex +krap +krappy +kraut +kum +kumbubble +kumbullbe +kummer +kumming +kumquat 
+kums +kunilingus +kunnilingus +kunt +ky +kyke +lactate +laid +lapdance +latin +lesbain +lesbayn +lesbian +lesbin +lesbo +lez +lezbe +lezbefriends +lezbo +lezz +lezzo +liberal +libido +licker +lickme +lies +limey +limpdick +limy +lingerie +liquor +livesex +loadedgun +lolita +looser +loser +lotion +lovebone +lovegoo +lovegun +lovejuice +lovemuscle +lovepistol +loverocket +lowlife +lsd +lubejob +lucifer +luckycammeltoe +lugan +lynch +macaca +mad +mafia +magicwand +mams +manhater +manpaste +marijuana +mastabate +mastabater +masterbate +masterblaster +mastrabator +masturbate +masturbating +mattressprincess +meatbeatter +meatrack +meth +mexican +mgger +mggor +mickeyfinn +mideast +milf +minority +mockey +mockie +mocky +mofo +moky +moles +molest +molestation +molester +molestor +moneyshot +mooncricket +mormon +moron +moslem +mosshead +mothafuck +mothafucka +mothafuckaz +mothafucked +mothafucker +mothafuckin +mothafucking +mothafuckings +motherfuck +motherfucked +motherfucker +motherfuckin +motherfucking +motherfuckings +motherlovebone +muff +muffdive +muffdiver +muffindiver +mufflikcer +mulatto +muncher +munt +murder +murderer +muslim +naked +narcotic +nasty +nastybitch +nastyho +nastyslut +nastywhore +nazi +necro +negro +negroes +negroid +negro's +nig +niger +nigerian +nigerians +nigg +nigga +niggah +niggaracci +niggard +niggarded +niggarding +niggardliness +niggardliness's +niggardly +niggards +niggard's +niggaz +nigger +niggerhead +niggerhole +niggers +nigger's +niggle +niggled +niggles +niggling +nigglings +niggor +niggur +niglet +nignog +nigr +nigra +nigre +nip +nipple +nipplering +nittit +nlgger +nlggor +nofuckingway +nook +nookey +nookie +noonan +nooner +nude +nudger +nuke +nutfucker +nymph +ontherag +oral +orga +orgasim +orgasm +orgies +orgy +osama +paki +palesimian +palestinian +pansies +pansy +panti +panties +payo +pearlnecklace +peck +pecker +peckerwood +pee +peehole +pee-pee +peepshow +peepshpw +pendy +penetration +peni5 +penile +penis +penises +penthouse 
+period +perv +phonesex +phuk +phuked +phuking +phukked +phukking +phungky +phuq +pi55 +picaninny +piccaninny +pickaninny +piker +pikey +piky +pimp +pimped +pimper +pimpjuic +pimpjuice +pimpsimp +pindick +piss +pissed +pisser +pisses +pisshead +pissin +pissing +pissoff +pistol +pixie +pixy +playboy +playgirl +pocha +pocho +pocketpool +pohm +polack +pom +pommie +pommy +poo +poon +poontang +poop +pooper +pooperscooper +pooping +poorwhitetrash +popimp +porchmonkey +porn +pornflick +pornking +porno +pornography +pornprincess +pot +poverty +premature +pric +prick +prickhead +primetime +propaganda +pros +prostitute +protestant +pu55i +pu55y +pube +pubic +pubiclice +pud +pudboy +pudd +puddboy +puke +puntang +purinapricness +puss +pussie +pussies +pussy +pussycat +pussyeater +pussyfucker +pussylicker +pussylips +pussylover +pussypounder +pusy +quashie +queef +queer +quickie +quim +ra8s +rabbi +racial +racist +radical +radicals +raghead +randy +rape +raped +raper +rapist +rearend +rearentry +rectum +redlight +redneck +reefer +reestie +refugee +reject +remains +rentafuck +republican +rere +retard +retarded +ribbed +rigger +rimjob +rimming +roach +robber +roundeye +rump +russki +russkie +sadis +sadom +samckdaddy +sandm +sandnigger +satan +scag +scallywag +scat +schlong +screw +screwyou +scrotum +scum +semen +seppo +servant +sex +sexed +sexfarm +sexhound +sexhouse +sexing +sexkitten +sexpot +sexslave +sextogo +sextoy +sextoys +sexual +sexually +sexwhore +sexy +sexymoma +sexy-slim +shag +shaggin +shagging +shat +shav +shawtypimp +sheeney +shhit +shinola +shit +shitcan +shitdick +shite +shiteater +shited +shitface +shitfaced +shitfit +shitforbrains +shitfuck +shitfucker +shitfull +shithapens +shithappens +shithead +shithouse +shiting +shitlist +shitola +shitoutofluck +shits +shitstain +shitted +shitter +shitting +shitty +shoot +shooting +shortfuck +showtime +sick +sissy +sixsixsix +sixtynine +sixtyniner +skank +skankbitch +skankfuck +skankwhore +skanky +skankybitch +skankywhore 
+skinflute +skum +skumbag +slant +slanteye +slapper +slaughter +slav +slave +slavedriver +sleezebag +sleezeball +slideitin +slime +slimeball +slimebucket +slopehead +slopey +slopy +slut +sluts +slutt +slutting +slutty +slutwear +slutwhore +smack +smackthemonkey +smut +snatch +snatchpatch +snigger +sniggered +sniggering +sniggers +snigger's +sniper +snot +snowback +snownigger +sob +sodom +sodomise +sodomite +sodomize +sodomy +sonofabitch +sonofbitch +sooty +sos +soviet +spaghettibender +spaghettinigger +spank +spankthemonkey +sperm +spermacide +spermbag +spermhearder +spermherder +spic +spick +spig +spigotty +spik +spit +spitter +splittail +spooge +spreadeagle +spunk +spunky +squaw +stagg +stiffy +strapon +stringer +stripclub +stroke +stroking +stupid +stupidfuck +stupidfucker +suck +suckdick +sucker +suckme +suckmyass +suckmydick +suckmytit +suckoff +suicide +swallow +swallower +swalow +swastika +sweetness +syphilis +taboo +taff +tampon +tang +tantra +tarbaby +tard +teat +terror +terrorist +teste +testicle +testicles +thicklips +thirdeye +thirdleg +threesome +threeway +timbernigger +tinkle +tit +titbitnipply +titfuck +titfucker +titfuckin +titjob +titlicker +titlover +tits +tittie +titties +titty +tnt +toilet +tongethruster +tongue +tonguethrust +tonguetramp +tortur +torture +tosser +towelhead +trailertrash +tramp +trannie +tranny +transexual +transsexual +transvestite +triplex +trisexual +trojan +trots +tuckahoe +tunneloflove +turd +turnon +twat +twink +twinkie +twobitwhore +uck +uk +unfuckable +upskirt +uptheass +upthebutt +urinary +urinate +urine +usama +uterus +vagina +vaginal +vatican +vibr +vibrater +vibrator +vietcong +violence +virgin +virginbreaker +vomit +vulva +wab +wank +wanker +wanking +waysted +weapon +weenie +weewee +welcher +welfare +wetb +wetback +wetspot +whacker +whash +whigger +whiskey +whiskeydick +whiskydick +whit +whitenigger +whites +whitetrash +whitey +whiz +whop +whore +whorefucker +whorehouse +wigger +willie +williewanker +willy +wn +wog 
+women's
+wop
+wtf
+wuss
+wuzzie
+xtc
+xxx
+yankee
+yellowman
+zigabo
+zipperhead
diff --git a/lib/swish/chat.pl b/lib/swish/chat.pl
index cc69e6b..1a54fe7 100644
--- a/lib/swish/chat.pl
+++ b/lib/swish/chat.pl
@@ -73,6 +73,7 @@
 :- use_module(chatstore).
 :- use_module(authenticate).
 :- use_module(pep).
+:- use_module(content_filter).

 :- html_meta(chat_to_profile(+, html)).
@@ -938,24 +939,83 @@ json_message(Dict, WSID) :-
 json_message(Dict, WSID) :-
     _{type: "chat-message", docid:DocID} :< Dict, !,
     chat_add_user_id(WSID, Dict, Message),
-    (   ws_authorized(chat(post(Message, DocID)), Message.user)
-    ->  chat_relay(Message)
-    ;   chat_spam(Msg),
-        hub_send(WSID, json(json{type:forbidden,
+    (   forbidden(Message, DocID, Why)
+    ->  hub_send(WSID, json(json{type:forbidden,
                                  action:chat_post,
                                  about:DocID,
-                                 message:Msg
+                                 message:Why
                                 }))
+    ;   chat_relay(Message)
     ).
 json_message(Dict, _WSID) :-
     debug(chat(ignored), 'Ignoring JSON message ~p', [Dict]).

-chat_spam("Due to frequent spamming we were forced to limit \c
-           posting chat messages to users who are logged in.").
-
 dict_file_name(Dict, File) :-
     atom_string(File, Dict.get(file)).

+%!  forbidden(+Message, +DocID, -Why) is semidet.
+%
+%   True if the chat Message about DocID must be forbidden, in which
+%   case Why is unified with a string indicating the reason.
+%   Currently:
+%
+%     - Demands the user to be logged on
+%     - Limits the size of the message and its payloads
+
+forbidden(Message, DocID, Why) :-
+    \+ ws_authorized(chat(post(Message, DocID)), Message.user), !,
+    Why = "Due to frequent spamming we were forced to limit \c
+           posting chat messages to users who are logged in.".
+forbidden(Message, _DocID, Why) :-
+    Text = Message.get(text),
+    string_length(Text, Len),
+    Len > 500,
+    Why = "Chat messages are limited to 500 characters".
+forbidden(Message, _DocID, Why) :-
+    Payloads = Message.get(payload),
+    member(Payload, Payloads),
+    large_payload(Payload, Why), !.
+forbidden(Message, _DocID, Why) :-
+    eval_content(Message.get(text), _WC, Score),
+    user_score(Message, Score, Cummulative, _Count),
+    Score*2 + Cummulative < 0,
+    !,
+    Why = "Chat messages must be in English and avoid offensive language".
+
+large_payload(Payload, Why) :-
+    Selections = Payload.get(selection),
+    member(Selection, Selections),
+    (   string_length(Selection.get(string), SelLen), SelLen > 500
+    ;   string_length(Selection.get(context), SelLen), SelLen > 500
+    ), !,
+    Why = "Selection too long (max. 500 characters)".
+large_payload(Payload, Why) :-
+    string_length(Payload.get(query), QLen), QLen > 1000, !,
+    Why = "Query too long (max. 1000 characters)".
+
+user_score(Message, MsgScore, Cummulative, Count) :-
+    Profile = Message.get(user).get(profile_id), !,
+    block(Profile, MsgScore, Cummulative, Count).
+user_score(_, _, 0, 1).
+
+%!  block(+User, +Score, -Cummulative, -Count)
+%
+%   Keep a count and cummulative score for a user.
+
+:- dynamic
+    blocked/4.
+
+block(User, Score, Cummulative, Count) :-
+    blocked(User, Score0, Count0, Time), !,
+    get_time(Now),
+    Cummulative = Score0*(0.5**((Now-Time)/600)) + Score,
+    Count is Count0 + 1,
+    asserta(blocked(User, Cummulative, Count, Now)),
+    retractall(blocked(User, Score0, Count0, Time)).
+block(User, Score, Score, 1) :-
+    get_time(Now),
+    asserta(blocked(User, Score, 1, Now)).

                 /*******************************
                 *        CHAT MESSAGES         *
                 *******************************/

diff --git a/lib/swish/chatstore.pl b/lib/swish/chatstore.pl
index 60520ab..379cd93 100644
--- a/lib/swish/chatstore.pl
+++ b/lib/swish/chatstore.pl
@@ -58,12 +58,22 @@
 :- multifile swish_config:chat_count_about/2.   % +DocID, -Count

-:- initialization open_chatstore.
+:- listen(http(pre_server_start),
+          open_chatstore).

 :- dynamic  storage_dir/1.
 :- volatile storage_dir/1.

 open_chatstore :-
+    storage_dir(_),
+    !.
+open_chatstore :-
+    with_mutex(chat_store, open_chatstore_guarded).
+
+open_chatstore_guarded :-
+    storage_dir(_),
+    !.
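The `block/4` bookkeeping in the chat.pl hunk above keeps one record per user and decays the accumulated score with a half-life of 600 seconds before folding in each new message score; `forbidden/3` then rejects a message when twice its own score plus the decayed total is negative. A Python sketch of that arithmetic follows; the function names mirror the Prolog predicates, but this is an illustration of the scoring scheme, not SWISH code.

```python
import time

# Per-user record mirroring the blocked/4 facts:
# user -> (cumulative_score, message_count, last_update_time)
_blocked = {}
_HALF_LIFE = 600.0   # seconds; the prior score halves every 10 minutes

def block(user, score, now=None):
    """Fold a new message score into the user's decayed running total,
    as block/4 does: cumulative = prev * 0.5**(dt/600) + score."""
    now = time.time() if now is None else now
    if user in _blocked:
        prev, count, then = _blocked[user]
        cumulative = prev * 0.5 ** ((now - then) / _HALF_LIFE) + score
        count += 1
    else:
        cumulative, count = score, 1
    _blocked[user] = (cumulative, count, now)
    return cumulative, count

def looks_like_spam(msg_score, cumulative):
    """forbidden/3's final clause: reject when 2*Score + Cummulative < 0."""
    return msg_score * 2 + cumulative < 0
```

A user who posts one bad message (negative score) and then waits ten minutes has that penalty halved, so occasional mistakes wash out while sustained abuse accumulates.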
+open_chatstore_guarded :- setting(directory, Spec), absolute_file_name(Spec, Dir, [ file_type(directory), @@ -71,7 +81,7 @@ open_chatstore :- file_errors(fail) ]), !, asserta(storage_dir(Dir)). -open_chatstore :- +open_chatstore_guarded :- setting(directory, Spec), absolute_file_name(Spec, Dir, [ solutions(all) @@ -82,7 +92,12 @@ open_chatstore :- fail), !, asserta(storage_dir(Dir)). +%! chat_dir_file(+DocID, -Path, -File) +% +% True when Path/File is the place to store char messages about DocID. + chat_dir_file(DocID, Path, File) :- + open_chatstore, sha_hash(DocID, Bin, []), hash_atom(Bin, Hash), sub_atom(Hash, 0, 2, _, D1), @@ -92,10 +107,6 @@ chat_dir_file(DocID, Path, File) :- atomic_list_concat([Dir, D1, D2], /, Path), atomic_list_concat([Path, Name], /, File). -chat_file(DocID, File) :- - chat_dir_file(DocID, Dir, File), - make_directory_path(Dir). - %! existing_chat_file(+DocID, -File) is semidet. % % True when File is the path of the file holding chat messages from @@ -115,12 +126,13 @@ existing_chat_file(DocID, File) :- chat_store(Message) :- chat{docid:DocIDS} :< Message, atom_string(DocID, DocIDS), - chat_file(DocID, File), + chat_dir_file(DocID, Dir, File), ( del_dict(create, Message, false, Message1) -> exists_file(File) ; Message1 = Message ), !, + make_directory_path(Dir), strip_chat(Message1, Message2), with_mutex(chat_store, ( setup_call_cleanup( diff --git a/lib/swish/content_filter.pl b/lib/swish/content_filter.pl new file mode 100644 index 0000000..c234f34 --- /dev/null +++ b/lib/swish/content_filter.pl @@ -0,0 +1,242 @@ +/* Part of SWISH + + Author: Jan Wielemaker + E-mail: J.Wielemaker@cs.vu.nl + WWW: http://www.swi-prolog.org + Copyright (C): 2017, VU University Amsterdam + CWI Amsterdam + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. 
Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in + the documentation and/or other materials provided with the + distribution. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, + BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; + LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER + CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN + ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE + POSSIBILITY OF SUCH DAMAGE. +*/ + +:- module(content_filter, + [ eval_content/3 % +Text, -Wordcount, -Score + ]). +:- use_module(library(porter_stem)). +:- use_module(library(apply)). +:- use_module(library(debug)). + +/** <module> Ban list content filter + +@see https://www.cs.cmu.edu/~biglou/resources/bad-words.txt +@see https://www.freewebheaders.com/full-list-of-bad-words-banned-by-google/ +*/ + +:- dynamic + black/1, + white/1, + wl_loaded/0. + +%! eval_content(+Text, -WordCount, -Score) is det. +% +% Evaluate the content of Text. WordCount is the number of non-trivial +% words and Score is the evaluation. + +eval_content(Text, WordCount, Score) :- + read_word_lists, + tokenize_atom(Text, Tokens), + wordlist(Tokens, Words), + length(Words, WordCount), + foldl(acc_score, Words, 0-0, Score0-_Acc), + Score is max(-100, min(100, Score0)). 
+ +acc_score(Word, V0-A0, V-A) :- + downcase_atom(Word, Lower), + score(Lower, WScore), + A is min(0, A0//2 + WScore), + V is V0+A+WScore, + debug(spam, '~w: ~w, ~w -> ~w', [Word, WScore, V0-A0, V-A]). + +score(Word, 20) :- + current_predicate(Word, system:_), + !. +score(Word, Score) :- + black(Word), + !, + Score = -50. +score(Word, Score) :- + white(Word), + !, + Score = 10. +score(Word, Score) :- + glued_identifier(Word), + !, + Score = 15. +score(_, -5). + +glued_identifier(Word) :- + id_part(Word), + !. +glued_identifier(Word) :- + atom_length(Word, Len), + Len > 25, + !, + fail. +glued_identifier(Word) :- + ( sub_atom(Word, _, _, _, '_'), + atomic_list_concat(Parts, '_', Word), + Parts = [_,_|_] + -> ! + ), + maplist(id_part, Parts). +glued_identifier(Word) :- + atom_concat(Pre, Rest, Word), + atom_length(Pre, L), + L > 2, + id_part(Pre), + glued_identifier(Rest), + !. +glued_identifier(Word) :- + ( atom_concat(Pre, Rest, Word), + atom_number(Rest, _) + -> id_part(Pre) + ). + +id_part(Part) :- + atom_length(Part, 1), + !. +id_part(Part) :- + atom_number(Part, _). +id_part(Part) :- + downcase_atom(Part, Word), + white(Word), + !, + \+ black(Word). + + +%! wordlist(+Tokens, -WordList) is det. +% +% Filter the token list. Removes numbers and joins typical escape +% patterns such as 't h i s' or 't.h.i.s'. + +wordlist([], []). +wordlist([H|T0], Words) :- + single_char(H), + !, + single_chars(T0, Chars, T), + ( make_word([H|Chars], Word) + -> Words = [Word|TW] + ; TW = Words + ), + wordlist(T, TW). +wordlist([H|T], Words) :- + number(H), + !, + wordlist(T, Words). +wordlist([H|T0], [H|T]) :- + wordlist(T0, T). + +single_chars([H|T0], [H|T], Rest) :- + single_char(H), + !, + single_chars(T0, T, Rest). +single_chars(List, [], List). + +single_char(H) :- + atom(H), + !, + atom_length(H, 1). +single_char(H) :- + integer(H), + between(0, 9, H). + +make_word(List, Word) :- + separated(List, _Sep, Chars), + wordy(Chars), + atomic_list_concat(Chars, Word). 
+make_word(List, Word) :- + wordy(List), + !, + atomic_list_concat(List, Word). + +separated([H,Sep|T0], Sep, [H|T]) :- + char_type(Sep, punct), + separated_([Sep|T0], Sep, T). + +separated_([], _, []). +separated_([Sep,H|T0], Sep, [H|T]) :- + separated_(T0, Sep, T). + +wordy(Chars) :- + wordy(Chars, 0, V), + V >= 3. + +wordy([H|T], V0, V) :- + char_type(H, alnum), + !, + V1 is V0 + 1, + wordy(T, V1, V). +wordy([_|T], V0, V) :- + V1 is V0 - 1, + wordy(T, V1, V). + + + /******************************* + * WORD LISTS * + *******************************/ + +read_word_lists :- + wl_loaded, + !. +read_word_lists :- + with_mutex(content_filter, read_word_lists_sync). + +read_word_lists_sync :- + wl_loaded, + !. +read_word_lists_sync :- + read_word_list(wordlist('words'), white), + read_word_list(wordlist('whitelist.txt'), white), + read_word_list(wordlist('bad-words.txt'), black), + read_word_list(wordlist('bad-words-google.txt'), black), + assertz(wl_loaded). + +:- multifile user:file_search_path/2. +user:file_search_path(wordlist, '/usr/share/dict'). +user:file_search_path(wordlist, Dir) :- + source_file(read_word_lists, SrcFile), + file_directory_name(SrcFile, Dir). + +%! read_word_list(+FileSpec, +List) is det. +% +% Read a list of words into a fact. + +read_word_list(File, List) :- + absolute_file_name(File, Path, [access(read), file_errors(fail)]), + !, + setup_call_cleanup( + open(Path, read, In, [encoding(utf8)]), + ( lazy_list(lazy_read_lines(In, [as(atom)]), Words), + forall(member(Word, Words), + assert_word(List, Word)) + ), + close(In)). +read_word_list(_, _). + +assert_word(black, Word0) :- downcase_atom(Word0, Word), assertz(black(Word)). +assert_word(white, Word0) :- downcase_atom(Word0, Word), assertz(white(Word)). 
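A short sketch of how eval_content/3 above is meant to be called. The classify/2 wrapper and the library(swish/content_filter) load path are assumptions, and concrete scores depend on which word lists are installed:

```prolog
% Hypothetical caller of the filter.  eval_content/3 loads the word
% lists once, tokenizes the text with tokenize_atom/2, folds
% acc_score/3 over the non-trivial words and clamps the result to
% the range -100..100, so negative scores indicate likely spam.
:- use_module(library(swish/content_filter)).

classify(Text, Class) :-
    eval_content(Text, WordCount, Score),
    debug(spam, '~D words, score ~d', [WordCount, Score]),
    (   Score < 0
    ->  Class = suspect
    ;   Class = ok
    ).
```

Note that every word that is neither whitelisted, a recognisable identifier nor a system predicate name costs 5 points, so a run of random strings quickly drives the score negative.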
+ diff --git a/lib/swish/cron.pl b/lib/swish/cron.pl new file mode 100644 index 0000000..697ccd0 --- /dev/null +++ b/lib/swish/cron.pl @@ -0,0 +1,168 @@ +/* Part of SWISH + + Author: Jan Wielemaker + E-mail: J.Wielemaker@cs.vu.nl + WWW: http://www.swi-prolog.org + Copyright (C): 2017, VU University Amsterdam + CWI Amsterdam + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in + the documentation and/or other materials provided with the + distribution. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, + BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; + LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER + CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN + ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE + POSSIBILITY OF SUCH DAMAGE. +*/ + +:- module(http_cron, + [ http_schedule_maintenance/2 % +When, :Goal + ]). +:- use_module(library(broadcast)). +:- use_module(library(error)). + +:- meta_predicate + http_schedule_maintenance(+, 0). + +:- dynamic + cron_schedule/2. 
% Schedule, Goal
+
+/** <module> Schedule HTTP server maintenance tasks
+
+This module deals with scheduling low frequency maintenance tasks to run
+at specified time stamps. The jobs are scheduled on the wall clock and
+thus the interval is kept over server restarts.
+*/
+
+%! http_schedule_maintenance(+When, :Goal) is det.
+%
+% Schedule running Goal based on maintenance broadcasts. When is one
+% of:
+%
+% - daily(Hour:Min)
+% Run each day at Hour:Min. Min is rounded to a multiple
+% of 5.
+% - weekly(Day, Hour:Min)
+% Run at the given Day and Time each week. Day is either a
+% number 1..7 (1 is Monday) or a weekday name or abbreviation.
+% - monthly(DayOfTheMonth, Hour:Min)
+% Run each month at the given Day (1..31). Note that not all
+% months have all days.
+% - clear
+% Clear the schedule for the given goal.
+%
+% This must be used with a timer that broadcasts a
+% maintenance(_,_) message (see broadcast/1). Such a timer is part
+% of library(http/http_unix_daemon).
+%
+% @arg Goal is the goal called. This is executed in the thread that
+% broadcasts the maintenance(_,_) event, i.e., by default in the
+% `main` thread. If a considerable amount of work is to be done it is
+% advised to start a _detached_ thread to do the real work.
+
+http_schedule_maintenance(When, Goal) :-
+ listen(maintenance(_,_), http_consider_cronstart),
+ ( compile_schedule(When, Schedule)
+ -> clear_schedule(Goal),
+ ( Schedule == clear
+ -> true
+ ; asserta(cron_schedule(Schedule, Goal))
+ )
+ ; domain_error(schedule, When)
+ ).
+
+clear_schedule(Goal) :-
+ ( clause(cron_schedule(_, Goal0), true, Ref),
+ Goal =@= Goal0,
+ erase(Ref),
+ fail
+ ; true
+ ).
+
+compile_schedule(Var, _) :-
+ var(Var),
+ !,
+ instantiation_error(Var).
+compile_schedule(clear, clear).
+compile_schedule(daily(Time0), daily(Time)) :-
+ compile_time(Time0, Time).
+compile_schedule(weekly(Day0, Time0), weekly(Day, Time)) :-
+ compile_weekday(Day0, Day),
+ compile_time(Time0, Time).
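A usage sketch for the scheduler documented above. The goal names and the exact load path are hypothetical; http_schedule_maintenance/2 and its When terms are as described:

```prolog
% Assumes the server is started via library(http/http_unix_daemon),
% whose timer broadcasts the maintenance(_,_) events this module
% listens for.
:- use_module(library(swish/cron)).

:- http_schedule_maintenance(daily(3:30), nightly_gc),
   http_schedule_maintenance(weekly(sun, 4:00), weekly_repack).

% As advised above, do substantial work in a detached thread so the
% broadcasting (main) thread is not blocked.
nightly_gc :-
    thread_create(gc_the_store, _, [detached(true)]).
```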
+compile_schedule(monthly(Day, Time0), monthly(Day, Time)) :-
+ must_be(between(1, 31), Day),
+ compile_time(Time0, Time).
+
+compile_time(HH:MM0, HH:MM) :-
+ must_be(between(0, 23), HH),
+ must_be(between(0, 59), MM0),
+ MM is ((MM0+4)//5)*5.
+
+compile_weekday(N, _) :-
+ var(N),
+ !,
+ instantiation_error(N).
+compile_weekday(N, N) :-
+ integer(N),
+ !,
+ must_be(between(1,7), N).
+compile_weekday(Day, N) :-
+ downcase_atom(Day, Lwr),
+ ( sub_atom(Lwr, 0, 3, _, Abbr),
+ day(N, Abbr)
+ -> !
+ ; domain_error(day, Day)
+ ).
+
+%! http_consider_cronstart
+%
+% Run scheduled tasks.
+
+http_consider_cronstart :-
+ get_time(NowF),
+ Now is round(NowF/60.0)*60,
+ ( cron_schedule(Schedule, Goal),
+ scheduled(Schedule, Now),
+ catch(Goal, E, print_message(warning, E)),
+ fail
+ ; true
+ ).
+
+scheduled(daily(HH:MM), Now) :-
+ stamp_date_time(Now, DateTime, local),
+ date_time_value(time, DateTime, time(HH,MM,_)).
+scheduled(weekly(Day, Time), Now) :-
+ stamp_date_time(Now, DateTime, local),
+ date_time_value(date, DateTime, Date),
+ day_of_the_week(Date, Day),
+ scheduled(daily(Time), Now).
+scheduled(monthly(Day, Time), Now) :-
+ stamp_date_time(Now, DateTime, local),
+ date_time_value(day, DateTime, Day),
+ scheduled(daily(Time), Now).
+
+day(1, mon).
+day(2, tue).
+day(3, wed).
+day(4, thu).
+day(5, fri).
+day(6, sat).
+day(7, sun).
diff --git a/lib/swish/gitty.pl b/lib/swish/gitty.pl
index 9c92790..9a66294 100644
--- a/lib/swish/gitty.pl
+++ b/lib/swish/gitty.pl
@@ -3,7 +3,7 @@
 Author: Jan Wielemaker
 E-mail: J.Wielemaker@vu.nl
 WWW: http://www.swi-prolog.org
- Copyright (c) 2014-2015, VU University Amsterdam
+ Copyright (c) 2014-2018, VU University Amsterdam
 All rights reserved.
Redistribution and use in source and binary forms, with or without
@@ -35,8 +35,10 @@
 :- module(gitty,
 [ gitty_open/2, % +Store, +Options
 gitty_close/1, % +Store
+ gitty_driver/2, % +Store, -Driver
 gitty_file/3, % +Store, ?Name, ?Hash
+ gitty_file/4, % +Store, ?Name, ?Ext, ?Hash
 gitty_create/5, % +Store, +Name, +Data, +Meta, -Commit
 gitty_update/5, % +Store, +Name, +Data, +Meta, -Commit
 gitty_commit/3, % +Store, +Name, -Meta
@@ -44,6 +46,10 @@
 gitty_history/4, % +Store, +Name, -History, +Options
 gitty_hash/2, % +Store, ?Hash
+ gitty_fsck/1, % +Store
+ gitty_save/4, % +Store, +Data, +Type, -Hash
+ gitty_load/4, % +Store, +Hash, -Data, -Type
+ gitty_reserved_meta/1, % ?Key
 is_gitty_hash/1, % @Term
@@ -140,6 +146,14 @@ store_driver_module(Store, Module) :-
 atom(Store), !,
 gitty_store_type(Store, Module).
+%! gitty_driver(+Store, -Driver)
+%
+% Get the current gitty driver.
+
+gitty_driver(Store, Driver) :-
+ store_driver_module(Store, Module),
+ driver_module(Driver, Module), !.
+
 %% gitty_close(+Store) is det.
 %
 % Close access to the Store.
@@ -148,14 +162,17 @@ gitty_close(Store) :-
 store_driver_module(Store, M),
 M:gitty_close(Store).
-%% gitty_file(+Store, ?File, ?Head) is nondet.
+%% gitty_file(+Store, ?Head, ?Hash) is nondet.
+%% gitty_file(+Store, ?Head, ?Ext, ?Hash) is nondet.
 %
-% True when File entry in the gitty store and Head is the HEAD
-% revision.
+% True when Hash is an entry in the gitty Store and Head is the
+% HEAD revision.
 gitty_file(Store, Head, Hash) :-
+ gitty_file(Store, Head, _Ext, Hash).
+gitty_file(Store, Head, Ext, Hash) :-
 store_driver_module(Store, M),
- M:gitty_file(Store, Head, Hash).
+ M:gitty_file(Store, Head, Ext, Hash).
 %% gitty_create(+Store, +Name, +Data, +Meta, -Commit) is det.
 %
@@ -373,13 +390,34 @@ size_in_bytes(Data, Size) :-
 close(Out)).
+%! gitty_fsck(+Store) is det.
+%
+% Check the integrity of the store.
+
+gitty_fsck(Store) :-
+ forall(gitty_hash(Store, Hash),
+ fsck_object_msg(Store, Hash)),
+ store_driver_module(Store, M),
+ M:gitty_fsck(Store).
+
+fsck_object_msg(Store, Hash) :-
+ fsck_object(Store, Hash), !.
+fsck_object_msg(Store, Hash) :-
+ print_message(error, gitty(Store, fsck(bad_object(Hash)))).
+
 %% fsck_object(+Store, +Hash) is semidet.
 %
 % Test the integrity of object Hash in Store.
-:- public fsck_object/2.
+:- public
+ fsck_object/2,
+ check_object/4.
+
 fsck_object(Store, Hash) :-
 load_object(Store, Hash, Data, Type, Size),
+ check_object(Hash, Data, Type, Size).
+
+check_object(Hash, Data, Type, Size) :-
 format(string(Hdr), '~w ~d\u0000', [Type, Size]),
 sha_new_ctx(Ctx0, []),
 sha_hash_ctx(Ctx0, Hdr, Ctx1, _),
@@ -387,6 +425,8 @@ fsck_object(Store, Hash) :-
 hash_atom(HashBin, Hash).
+
+
 %% load_object(+Store, +Hash, -Data) is det.
 %% load_object(+Store, +Hash, -Data, -Type, -Size) is det.
 %
@@ -398,6 +438,20 @@ load_object(Store, Hash, Data, Type, Size) :-
 store_driver_module(Store, Module),
 Module:load_object(Store, Hash, Data, Type, Size).
+%! gitty_save(+Store, +Data, +Type, -Hash) is det.
+%! gitty_load(+Store, +Hash, -Data, -Type) is det.
+%
+% Low-level object store. These predicates allow using the
+% store as an arbitrary content store.
+%
+% @arg Data is a string
+% @arg Type is an atom denoting the object type.
+
+gitty_save(Store, Data, Type, Hash) :-
+ save_object(Store, Data, Type, Hash).
+gitty_load(Store, Hash, Data, Type) :-
+ load_object(Store, Hash, Data, Type, _Size).
+
 %% gitty_hash(+Store, ?Hash) is nondet.
 %
 % True when Hash is an object in the store.
@@ -538,19 +592,33 @@ meta_tag_set(_, []).
 :- if(true).
+% Note that cleanup on possible errors is rather hard. The created tmp
+% stream must be closed and the file must be deleted. We also close the
+% file before running diff (necessary on Windows to avoid a sharing
+% violation).
Therefore reclaim_tmp_file/2 first uses close/2 to close
+% the stream if this was not already done and then deletes the file.
+
 udiff_string(Data1, Data2, UDIFF) :-
 setup_call_cleanup(
- save_string(Data1, File1),
- setup_call_cleanup(
- save_string(Data2, File2),
- process_diff(File1, File2, UDIFF),
- delete_file(File2)),
- delete_file(File1)).
-
-save_string(String, File) :-
- tmp_file_stream(utf8, File, TmpOut),
- format(TmpOut, '~s', [String]),
- close(TmpOut).
+ tmp_file_stream(utf8, File1, Tmp1),
+ ( save_string(Data1, Tmp1),
+ setup_call_cleanup(
+ tmp_file_stream(utf8, File2, Tmp2),
+ ( save_string(Data2, Tmp2),
+ process_diff(File1, File2, UDIFF)
+ ),
+ reclaim_tmp_file(File2, Tmp2))
+ ),
+ reclaim_tmp_file(File1, Tmp1)).
+
+save_string(String, Stream) :-
+ call_cleanup(
+ format(Stream, '~s', [String]),
+ close(Stream)).
+
+reclaim_tmp_file(File, Stream) :-
+ close(Stream, [force(true)]),
+ delete_file(File).
 process_diff(File1, File2, String) :-
 setup_call_cleanup(
diff --git a/lib/swish/gitty_driver_files.pl b/lib/swish/gitty_driver_files.pl
index eac1d6d..42f7450 100644
--- a/lib/swish/gitty_driver_files.pl
+++ b/lib/swish/gitty_driver_files.pl
@@ -3,7 +3,8 @@
 Author: Jan Wielemaker
 E-mail: J.Wielemaker@vu.nl
 WWW: http://www.swi-prolog.org
- Copyright (c) 2015, VU University Amsterdam
+ Copyright (c) 2015-2017, VU University Amsterdam
+ CWI Amsterdam
 All rights reserved.
Redistribution and use in source and binary forms, with or without @@ -34,7 +35,7 @@ :- module(gitty_driver_files, [ gitty_close/1, % +Store - gitty_file/3, % +Store, ?Name, ?Hash + gitty_file/4, % +Store, ?Name, ?Ext, ?Hash gitty_update_head/4, % +Store, +Name, +OldCommit, +NewCommit delete_head/2, % +Store, +Name @@ -45,14 +46,31 @@ gitty_hash/2, % +Store, ?Hash load_plain_commit/3, % +Store, +Hash, -Meta load_object/5, % +Store, +Hash, -Data, -Type, -Size + gitty_object_file/3, % +Store, +Hash, -File + + repack_objects/2, % +Store, +Options + pack_objects/6, % +Store, +Objs, +Packs, +PackDir, + % -File, +Opts + unpack_packs/1, % +Store + unpack_pack/2, % +Store, +PackFile + + attach_pack/2, % +Store, +PackFile + gitty_fsck/1, % +Store + fsck_pack/1, % +PackFile + load_object_from_pack/4, % +Hash, -Data, -Type, -Size gitty_rescan/1 % Store ]). +:- use_module(library(apply)). :- use_module(library(zlib)). :- use_module(library(filesex)). :- use_module(library(lists)). :- use_module(library(apply)). :- use_module(library(error)). +:- use_module(library(debug)). +:- use_module(library(zlib)). +:- use_module(library(hash_stream)). +:- use_module(library(option)). :- use_module(library(dcg/basics)). /** <module> Gitty plain files driver @@ -74,15 +92,22 @@ to rounding the small objects to disk allocation units. */ :- dynamic - head/3, % Store, Name, Hash - store/2, % Store, Updated - commit/3, % Store, Hash, Meta - heads_input_stream_cache/2. % Store, Stream + head/4, % Store, Name, Ext, Hash + store/2, % Store, Updated + commit/3, % Store, Hash, Meta + heads_input_stream_cache/2, % Store, Stream + pack_object/6, % Hash, Type, Size, Offset, Store,PackFile + attached_packs/1, % Store + attached_pack/2. % Store, PackFile + :- volatile - head/3, - store/2, - commit/3, - heads_input_stream_cache/2. + head/4, + store/2, + commit/3, + heads_input_stream_cache/2, + pack_object/6, + attached_packs/1, + attached_pack/2. 
% enable/disable syncing remote servers running on the same file store.
% This facility requires shared access to files and thus doesn't work on
@@ -94,31 +119,37 @@ remote_sync(false).
 remote_sync(true).
 :- endif.
-%% gitty_close(+Store) is det.
+%! gitty_close(+Store) is det.
 %
-% Close resources associated with a store.
+% Close resources associated with a store.
 gitty_close(Store) :-
- ( retract(heads_input_stream_cache(Store, In))
- -> close(In)
- ; true
- ),
- retractall(head(Store,_,_)),
- retractall(store(Store,_)).
+ ( retract(heads_input_stream_cache(Store, In))
+ -> close(In)
+ ; true
+ ),
+ retractall(head(Store,_,_,_)),
+ retractall(store(Store,_)),
+ retractall(pack_object(_,_,_,_,Store,_)).
-%% gitty_file(+Store, ?File, ?Head) is nondet.
+%% gitty_file(+Store, ?File, ?Ext, ?Head) is nondet.
 %
 % True when File is an entry in the gitty store and Head is the HEAD
 % revision.
-gitty_file(Store, Head, Hash) :-
+gitty_file(Store, Head, Ext, Hash) :-
 gitty_scan(Store),
- head(Store, Head, Hash).
+ head(Store, Head, Ext, Hash).
 %% load_plain_commit(+Store, +Hash, -Meta:dict) is semidet.
 %
-% Load the commit data as a dict.
+% Load the commit data as a dict. Loaded commits are cached in
+% commit/3. Note that only adding a fact to the cache is
+% synchronized. This means that during a race situation we may
+% load the same object multiple times from disk, but this is
+% harmless. A lock around the whole predicate would serialize
+% loading different objects, which is not needed.
 load_plain_commit(Store, Hash, Meta) :-
 must_be(atom, Store),
@@ -127,31 +158,41 @@ load_plain_commit(Store, Hash, Meta) :-
 load_plain_commit(Store, Hash, Meta) :-
 load_object(Store, Hash, String, _, _),
 term_string(Meta0, String, []),
- assertz(commit(Store, Hash, Meta0)),
+ with_mutex(gitty_commit_cache,
+ assert_cached_commit(Store, Hash, Meta0)),
 Meta = Meta0.
+assert_cached_commit(Store, Hash, Meta) :-
+ commit(Store, Hash, Meta0), !,
+ assertion(Meta0 =@= Meta).
+assert_cached_commit(Store, Hash, Meta) :- + assertz(commit(Store, Hash, Meta)). + %% store_object(+Store, +Hash, +Header:string, +Data:string) is det. % % Store the actual object. The store must associate Hash with the % concatenation of Hdr and Data. +store_object(Store, Hash, _Hdr, _Data) :- + pack_object(Hash, _Type, _Size, _Offset, Store, _Pack), !. store_object(Store, Hash, Hdr, Data) :- - sub_atom(Hash, 0, 2, _, Dir0), - sub_atom(Hash, 2, 2, _, Dir1), - sub_atom(Hash, 4, _, 0, File), - directory_file_path(Store, Dir0, D0), - ensure_directory(D0), - directory_file_path(D0, Dir1, D1), - ensure_directory(D1), - directory_file_path(D1, File, Path), - ( exists_file(Path) + gitty_object_file(Store, Hash, Path), + with_mutex(gitty_file, exists_or_create(Path, Out0)), + ( var(Out0) -> true ; setup_call_cleanup( - gzopen(Path, write, Out, [encoding(utf8)]), + zopen(Out0, Out, [format(gzip)]), format(Out, '~s~s', [Hdr, Data]), close(Out)) ). +exists_or_create(Path, _Out) :- + exists_file(Path), !. +exists_or_create(Path, Out) :- + file_directory_name(Path, Dir), + make_directory_path(Dir), + open(Path, write, Out, [encoding(utf8), lock(write)]). + ensure_directory(Dir) :- exists_directory(Dir), !. ensure_directory(Dir) :- @@ -161,19 +202,37 @@ ensure_directory(Dir) :- % % Load the given object. +load_object(_Store, Hash, Data, Type, Size) :- + load_object_from_pack(Hash, Data0, Type0, Size0), !, + f(Data0, Type0, Size0) = f(Data, Type, Size). load_object(Store, Hash, Data, Type, Size) :- - hash_file(Store, Hash, Path), + gitty_object_file(Store, Hash, Path), + exists_file(Path), setup_call_cleanup( gzopen(Path, read, In, [encoding(utf8)]), read_object(In, Data, Type, Size), close(In)). +%! load_object_header(+Store, +Hash, -Type, -Size) is det. 
+% +% Load the header of an object + +load_object_header(Store, Hash, Type, Size) :- + gitty_object_file(Store, Hash, Path), + setup_call_cleanup( + gzopen(Path, read, In, [encoding(utf8)]), + read_object_hdr(In, Type, Size), + close(In)). + read_object(In, Data, Type, Size) :- + read_object_hdr(In, Type, Size), + read_string(In, _, Data). + +read_object_hdr(In, Type, Size) :- get_code(In, C0), read_hdr(C0, In, Hdr), phrase((nonblanks(TypeChars), " ", integer(Size)), Hdr), - atom_codes(Type, TypeChars), - read_string(In, _, Data). + atom_codes(Type, TypeChars). read_hdr(C, In, [C|T]) :- C > 0, !, @@ -211,29 +270,36 @@ gitty_scan_sync(Store) :- store(Store, _), !. gitty_scan_sync(Store) :- remote_sync(true), !, + gitty_attach_packs(Store), restore_heads_from_remote(Store). gitty_scan_sync(Store) :- + gitty_attach_packs(Store), read_heads_from_objects(Store). %% read_heads_from_objects(+Store) is det. % -% Establish the head(Store,File,Hash) relation by reading all -% objects and adding a fact for the most recent commit. +% Establish the head(Store,File,Ext,Hash) relation by reading all +% objects and adding a fact for the most recent commit. read_heads_from_objects(Store) :- gitty_scan_latest(Store), forall(retract(latest(Name, Hash, _Time)), - assert(head(Store, Name, Hash))), + assert_head(Store, Name, Hash)), get_time(Now), assertz(store(Store, Now)). +assert_head(Store, Name, Hash) :- + file_name_extension(_, Ext, Name), + assertz(head(Store, Name, Ext, Hash)). + + %% gitty_scan_latest(+Store) % % Scans the gitty store, extracting the latest version of each % named entry. 
gitty_scan_latest(Store) :- - retractall(head(Store, _, _)), + retractall(head(Store, _, _, _)), retractall(latest(_, _, _)), ( gitty_hash(Store, Hash), load_object(Store, Hash, Data, commit, _Size), @@ -256,6 +322,19 @@ gitty_scan_latest(Store) :- gitty_hash(Store, Hash) :- var(Hash), !, + ( gitty_attach_packs(Store), + pack_object(Hash, _Type, _Size, _Offset, Store, _Pack) + ; gitty_file_object(Store, Hash) + ). +gitty_hash(Store, Hash) :- + ( gitty_attach_packs(Store), + pack_object(Hash, _Type, _Size, _Offset, Store, _Pack) + -> true + ; gitty_object_file(Store, Hash, File), + exists_file(File) + ). + +gitty_file_object(Store, Hash) :- access_file(Store, exist), directory_files(Store, Level0), member(E0, Level0), @@ -271,22 +350,24 @@ gitty_hash(Store, Hash) :- member(File, Files), atom_length(File, 36), atomic_list_concat([E0,E1,File], Hash). -gitty_hash(Store, Hash) :- - hash_file(Store, Hash, File), - exists_file(File). %% delete_object(+Store, +Hash) % % Delete an existing object delete_object(Store, Hash) :- - hash_file(Store, Hash, File), + gitty_object_file(Store, Hash, File), delete_file(File). -hash_file(Store, Hash, Path) :- - sub_atom(Hash, 0, 2, _, Dir0), - sub_atom(Hash, 2, 2, _, Dir1), - sub_atom(Hash, 4, _, 0, File), +%! gitty_object_file(+Store, +Hash, -Path) is det. +% +% True when Path is the file at which the object with Hash is +% stored. + +gitty_object_file(Store, Hash, Path) :- + sub_string(Hash, 0, 2, _, Dir0), + sub_string(Hash, 2, 2, _, Dir1), + sub_string(Hash, 4, _, 0, File), atomic_list_concat([Store, Dir0, Dir1, File], /, Path). 
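For illustration, the fan-out implemented by gitty_object_file/3 stores an object under two directory levels derived from the first four characters of its hash; the store name and hash below are made up:

```prolog
?- gitty_object_file(storage,
                     da39a3ee5e6b4b0d3255bfef95601890afd80709,
                     Path).
Path = 'storage/da/39/a3ee5e6b4b0d3255bfef95601890afd80709'.
```

As in git itself, this keeps individual directories small when the store holds many objects; the remaining 36-character file name is what gitty_file_object/2 above verifies with atom_length/2.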
@@ -323,12 +404,12 @@ gitty_update_head_sync(Store, Name, OldCommit, NewCommit, HeadsOut) :- gitty_update_head_sync2(Store, Name, OldCommit, NewCommit) :- gitty_scan(Store), % fetch remote changes ( OldCommit == (-) - -> ( head(Store, Name, _) + -> ( head(Store, Name, _, _) -> throw(error(gitty(file_exists(Name),_))) - ; assertz(head(Store, Name, NewCommit)) + ; assert_head(Store, Name, NewCommit) ) - ; ( retract(head(Store, Name, OldCommit)) - -> assertz(head(Store, Name, NewCommit)) + ; ( retract(head(Store, Name, _, OldCommit)) + -> assert_head(Store, Name, NewCommit) ; throw(error(gitty(not_at_head(Name, OldCommit)), _)) ) ). @@ -367,10 +448,10 @@ remote_update(Store) :- update_head(Store, head(Name, OldCommit, NewCommit)) :- ( OldCommit == (-) - -> \+ head(Store, Name, _) - ; retract(head(Store, Name, OldCommit)) + -> \+ head(Store, Name, _, _) + ; retract(head(Store, Name, _, OldCommit)) ), !, - assert(head(Store, Name, NewCommit)). + assert_head(Store, Name, NewCommit). update_head(_, _). %% remote_updates(+Store, -List) is det. @@ -460,15 +541,15 @@ restore_heads(Store, In) :- restore_heads(end_of_file, _, _) :- !. restore_heads(head(File, _, Hash), In, Store) :- - retractall(head(Store, File, _)), - assertz(head(Store, File, Hash)), + retractall(head(Store, File, _, _)), + assert_head(Store, File, Hash), read(In, Term), restore_heads(Term, In, Store). save_heads(Store, Out) :- get_time(Now), format(Out, 'epoch(~0f).~n~n', [Now]), - forall(head(Store, File, Hash), + forall(head(Store, File, _, Hash), format(Out, '~q.~n', [head(File, -, Hash)])). @@ -479,12 +560,465 @@ save_heads(Store, Out) :- % should they do their own thing? delete_head(Store, Head) :- - retractall(head(Store, Head, _)). + retractall(head(Store, Head, _, _)). %% set_head(+Store, +File, +Hash) is det. % % Set the head of the given File to Hash set_head(Store, File, Hash) :- - retractall(head(Store, File, _)), - asserta(head(Store, File, Hash)). 
+ file_name_extension(_, Ext, File), + ( head(Store, File, _, Hash0) + -> ( Hash == Hash0 + -> true + ; asserta(head(Store, File, Ext, Hash)), + retractall(head(Store, File, _, Hash0)) + ) + ; asserta(head(Store, File, Ext, Hash)) + ). + + + /******************************* + * PACKS * + *******************************/ + +/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + +<pack file> := <header> + <file>* +<header> := "gitty(Version).\n" <object>* "end_of_header.\n" +<object> := obj(Hash, Type, Size, FileSize) +- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */ + +pack_version(1). + +%! repack_objects(+Store, +Options) is det. +% +% Repack objects of Store for reduced disk usage and enhanced +% performance. By default this picks up all file objects of the store +% and all existing small pack files. Options: +% +% - small_pack(+Bytes) +% Consider all packs with less than Bytes as small and repack them. +% Default 10Mb +% - min_files(+Count) +% Do not repack if there are less than Count new files. +% Default 1,000. + +:- debug(gitty(pack)). + +repack_objects(Store, Options) :- + option(min_files(MinFiles), Options, 1_000), + findall(Object, gitty_file_object(Store, Object), Objects), + length(Objects, NewFiles), + debug(gitty(pack), 'Found ~D file objects', [NewFiles]), + ( NewFiles >= MinFiles + -> pack_files(Store, ExistingPacks), + option(small_pack(MaxSize), Options, 10_000_000), + include(small_file(MaxSize), ExistingPacks, PackFiles), + ( debugging(gitty(pack)) + -> length(PackFiles, PackCount), + debug(gitty(pack), 'Found ~D small packs', [PackCount]) + ; true + ), + directory_file_path(Store, pack, PackDir), + make_directory_path(PackDir), + pack_objects(Store, Objects, PackFiles, PackDir, _PackFile, Options) + ; debug(gitty(pack), 'Nothing to do', []) + ). + +small_file(MaxSize, File) :- + size_file(File, Size), + Size < MaxSize. + +%! pack_objects(+Store, +Objects, +Packs, +PackDir, +%! 
-PackFile, +Options) is det. +% +% Pack the given objects and pack files into a new pack. + +pack_objects(Store, Objects, Packs, PackDir, PackFile, Options) :- + with_mutex(gitty_pack, + pack_objects_sync(Store, Objects, Packs, PackDir, + PackFile, Options)). + +pack_objects_sync(_Store, [], [Pack], _, [Pack], _) :- + !. +pack_objects_sync(Store, Objects, Packs, PackDir, PackFilePath, Options) :- + length(Objects, ObjCount), + length(Packs, PackCount), + debug(gitty(pack), 'Repacking ~D objects and ~D packs', + [ObjCount, PackCount]), + maplist(object_info(Store), Objects, FileInfo), + maplist(pack_info(Store), Packs, PackInfo), + append([FileInfo|PackInfo], Info0), + sort(1, @<, Info0, Info), % remove possible duplicates + ( debugging(gitty(pack)) + -> ( PackCount > 0 + -> length(Info, FinalObjCount), + debug(gitty(pack), 'Total ~D objects', [FinalObjCount]) + ; true + ) + ; true + ), + directory_file_path(PackDir, 'pack-create', TmpPack), + setup_call_cleanup( + ( open(TmpPack, write, Out0, [type(binary), lock(write)]), + open_hash_stream(Out0, Out, [algorithm(sha1)]) + ), + ( write_signature(Out), + maplist(write_header(Out), Info), + format(Out, 'end_of_header.~n', []), + maplist(add_file(Out, Store), Info), + stream_hash(Out, SHA1) + ), + close(Out)), + format(atom(PackFile), 'pack-~w.pack', [SHA1]), + directory_file_path(PackDir, PackFile, PackFilePath), + rename_file(TmpPack, PackFilePath), + debug(gitty(pack), 'Attaching ~p', [PackFilePath]), + attach_pack(Store, PackFilePath), + remove_objects_after_pack(Store, Objects, Options), + delete(Packs, PackFilePath, RmPacks), + remove_repacked_packs(Store, RmPacks, Options), + debug(gitty(pack), 'Packing completed', []). + +object_info(Store, Object, obj(Object, Type, Size, FileSize)) :- + gitty_object_file(Store, Object, File), + load_object_header(Store, Object, Type, Size), + size_file(File, FileSize). 
+ +pack_info(Store, PackFile, Objects) :- + attach_pack(Store, PackFile), + pack_read_header(PackFile, _Version, _DataOffset, Objects). + +write_signature(Out) :- + pack_version(Version), + format(Out, "gitty(~d).~n", [Version]). + +write_header(Out, obj(Object, Type, Size, FileSize)) :- + format(Out, 'obj(~q,~q,~d,~d).~n', [Object, Type, Size, FileSize]). + +%! add_file(+Out, +Store, +Object) is det. +% +% Add Object from Store to the pack stream Out. + +add_file(Out, Store, obj(Object, _Type, _Size, _FileSize)) :- + gitty_object_file(Store, Object, File), + exists_file(File), + !, + setup_call_cleanup( + open(File, read, In, [type(binary)]), + copy_stream_data(In, Out), + close(In)). +add_file(Out, Store, obj(Object, Type, Size, FileSize)) :- + pack_object(Object, Type, Size, Offset, Store, PackFile), + setup_call_cleanup( + open(PackFile, read, In, [type(binary)]), + ( seek(In, Offset, bof, Offset), + copy_stream_data(In, Out, FileSize) + ), + close(In)). + + +%! gitty_fsck(+Store) is det. +% +% Validate all packs associated with Store + +gitty_fsck(Store) :- + pack_files(Store, PackFiles), + maplist(fsck_pack, PackFiles). + +%! fsck_pack(+File) is det. +% +% Validate the integrity of the pack file File. + +fsck_pack(File) :- + debug(gitty(pack), 'fsck ~p', [File]), + check_pack_hash(File), + debug(gitty(pack), 'fsck ~p: checking objects', [File]), + check_pack_objects(File), + debug(gitty(pack), 'fsck ~p: done', [File]). + +check_pack_hash(File) :- + file_base_name(File, Base), + file_name_extension(Plain, Ext, Base), + must_be(oneof([pack]), Ext), + atom_concat('pack-', Hash, Plain), + setup_call_cleanup( + ( open(File, read, In0, [type(binary)]), + open_hash_stream(In0, In, [algorithm(sha1)]) + ), + ( setup_call_cleanup( + open_null_stream(Null), + copy_stream_data(In, Null), + close(Null)), + stream_hash(In, SHA1) + ), + close(In)), + assertion(Hash == SHA1). 
+ +check_pack_objects(PackFile) :- + setup_call_cleanup( + open(PackFile, read, In, [type(binary)]), + ( read_header(In, Version, DataOffset, Objects), + set_stream(In, encoding(utf8)), + foldl(check_object(In, PackFile, Version), Objects, DataOffset, _) + ), + close(In)). + +check_object(In, PackFile, _Version, + obj(Object, Type, Size, FileSize), + Offset0, Offset) :- + Offset is Offset0+FileSize, + byte_count(In, Here), + ( Here == Offset0 + -> true + ; print_message(warning, pack(reposition(Here, Offset0))), + seek(In, Offset0, bof, Offset0) + ), + ( setup_call_cleanup( + zopen(In, In2, [multi_part(false), close_parent(false)]), + catch(read_object(In2, Data, _0RType, _0RSize), E, + ( print_message(error, + gitty(PackFile, fsck(read_object(Object, E)))), + fail)), + close(In2)) + -> byte_count(In, End), + ( End == Offset + -> true + ; print_message(error, + gitty(PackFile, fsck(object_end(Object, End, + Offset0, Offset, + Data)))) + ), + assertion(Type == _0RType), + assertion(Size == _0RSize), + gitty:check_object(Object, Data, Type, Size) + ; true + ). + + +%! gitty_attach_packs(+Store) is det. +% +% Attach all packs for Store + +gitty_attach_packs(Store) :- + attached_packs(Store), + !. +gitty_attach_packs(Store) :- + with_mutex(gitty_attach_packs, + gitty_attach_packs_sync(Store)). + +gitty_attach_packs_sync(Store) :- + attached_packs(Store), + !. +gitty_attach_packs_sync(Store) :- + pack_files(Store, PackFiles), + maplist(attach_pack(Store), PackFiles), + asserta(attached_packs(Store)). + +pack_files(Store, Packs) :- + directory_file_path(Store, pack, PackDir), + exists_directory(PackDir), + !, + directory_files(PackDir, Files), + convlist(is_pack(PackDir), Files, Packs). +pack_files(_, []). + +is_pack(PackDir, File, Path) :- + file_name_extension(_, pack, File), + directory_file_path(PackDir, File, Path). + +%! attach_pack(+Store, +PackFile) +% +% Load the index of Pack into memory. + +attach_pack(Store, PackFile) :- + attached_pack(Store, PackFile), + !. 
+attach_pack(Store, PackFile) :-
+ retractall(pack_object(_,_,_,_,_,PackFile)),
+ pack_read_header(PackFile, Version, DataOffset, Objects),
+ foldl(assert_object(Store, PackFile, Version), Objects, DataOffset, _),
+ assertz(attached_pack(Store, PackFile)).
+
+pack_read_header(PackFile, Version, DataOffset, Objects) :-
+ setup_call_cleanup(
+ open(PackFile, read, In, [type(binary)]),
+ read_header(In, Version, DataOffset, Objects),
+ close(In)).
+
+read_header(In, Version, DataOffset, Objects) :-
+ read(In, Signature),
+ ( Signature = gitty(Version)
+ -> true
+ ; domain_error(gitty_pack_file, Signature)
+ ),
+ read(In, Term),
+ read_index(Term, In, Objects),
+ get_code(In, Code),
+ assertion(Code == 0'\n),
+ byte_count(In, DataOffset).
+
+read_index(end_of_header, _, []) :-
+ !.
+read_index(Object, In, [Object|T]) :-
+ read(In, Term2),
+ read_index(Term2, In, T).
+
+assert_object(Store, Pack, _Version,
+ obj(Object, Type, Size, FileSize),
+ Offset0, Offset) :-
+ Offset is Offset0+FileSize,
+ assertz(pack_object(Object, Type, Size, Offset0, Store, Pack)).
+
+%! detach_pack(+Store, +Pack) is det.
+%
+% Remove a pack file from the memory index.
+
+detach_pack(Store, Pack) :-
+ retractall(pack_object(_, _, _, _, Store, Pack)),
+ retractall(attached_pack(Store, Pack)).
+
+%! load_object_from_pack(+Hash, -Data, -Type, -Size) is semidet.
+%
+% True when Hash is in a pack and can be loaded.
+
+load_object_from_pack(Hash, Data, Type, Size) :-
+ pack_object(Hash, Type, Size, Offset, _Store, Pack),
+ setup_call_cleanup(
+ open(Pack, read, In, [type(binary)]),
+ read_object_at(In, Offset, Data, Type, Size),
+ close(In)).
+
+read_object_at(In, Offset, Data, Type, Size) :-
+ seek(In, Offset, bof, Offset),
+ read_object_here(In, Data, Type, Size).
+
+read_object_here(In, Data, Type, Size) :-
+    stream_property(In, encoding(Enc)),
+    setup_call_cleanup(
+        ( set_stream(In, encoding(utf8)),
+          zopen(In, In2, [multi_part(false), close_parent(false)])
+        ),
+        read_object(In2, Data, Type, Size),
+        ( close(In2),
+          set_stream(In, encoding(Enc))
+        )).
+
+%!  unpack_packs(+Store) is det.
+%
+%   Unpack all packs.
+
+unpack_packs(Store) :-
+    absolute_file_name(Store, AbsStore, [file_type(directory),
+                                         access(read)]),
+    pack_files(AbsStore, Packs),
+    maplist(unpack_pack(AbsStore), Packs).
+
+%!  unpack_pack(+Store, +Pack) is det.
+%
+%   Turn a pack back into plain object files.
+
+unpack_pack(Store, PackFile) :-
+    pack_read_header(PackFile, _Version, DataOffset, Objects),
+    setup_call_cleanup(
+        open(PackFile, read, In, [type(binary)]),
+        foldl(create_file(Store, In), Objects, DataOffset, _),
+        close(In)),
+    detach_pack(Store, PackFile),
+    delete_file(PackFile).
+
+create_file(Store, In, obj(Object, _Type, _Size, FileSize), Offset0, Offset) :-
+    Offset is Offset0+FileSize,
+    gitty_object_file(Store, Object, Path),
+    with_mutex(gitty_file, exists_or_recreate(Path, Out)),
+    (   var(Out)
+    ->  true
+    ;   setup_call_cleanup(
+            seek(In, Offset0, bof, Offset0),
+            copy_stream_data(In, Out, FileSize),
+            close(Out))
+    ).
+
+exists_or_recreate(Path, _Out) :-
+    exists_file(Path), !.
+exists_or_recreate(Path, Out) :-
+    file_directory_name(Path, Dir),
+    make_directory_path(Dir),
+    open(Path, write, Out, [type(binary), lock(write)]).
+
+
+%!  remove_objects_after_pack(+Store, +Objects, +Options) is det.
+
+%   Remove the indicated (file) objects from Store.
+
+remove_objects_after_pack(Store, Objects, Options) :-
+    debug(gitty(pack), 'Deleting plain files', []),
+    maplist(delete_object(Store), Objects),
+    (   option(prune_empty_directories(true), Options, true)
+    ->  debug(gitty(pack), 'Pruning empty directories', []),
+        prune_empty_directories(Store)
+    ;   true
+    ).
+
+%!  remove_repacked_packs(+Store, +Packs, +Options)
+%
+%   Remove packs that have been repacked.
+
+remove_repacked_packs(Store, Packs, Options) :-
+    maplist(remove_pack(Store, Options), Packs).
+
+remove_pack(Store, _Options, Pack) :-
+    detach_pack(Store, Pack),
+    delete_file(Pack).
+
+%!  prune_empty_directories(+Dir) is det.
+%
+%   Prune directories that are empty below Dir. Dir itself is not
+%   removed, even if it is empty.
+
+prune_empty_directories(Dir) :-
+    prune_empty_directories(Dir, 0).
+
+prune_empty_directories(Dir, Level) :-
+    directory_files(Dir, AllFiles),
+    exclude(hidden, AllFiles, Files),
+    (   Files == [],
+        Level > 0
+    ->  delete_directory_async(Dir)
+    ;   convlist(prune_empty_directories(Dir, Level), Files, Left),
+        (   Left == [],
+            Level > 0
+        ->  delete_directory_async(Dir)
+        ;   true
+        )
+    ).
+
+hidden('.').
+hidden('..').
+
+prune_empty_directories(Parent, Level0, File, _) :-
+    directory_file_path(Parent, File, Path),
+    exists_directory(Path),
+    !,
+    Level is Level0 + 1,
+    prune_empty_directories(Path, Level),
+    fail.
+prune_empty_directories(_, _, File, File).
+
+delete_directory_async(Dir) :-
+    with_mutex(gitty_file, delete_directory_async2(Dir)).
+
+delete_directory_async2(Dir) :-
+    catch(delete_directory(Dir), E,
+          (   \+ exists_directory(Dir)
+          ->  true
+          ;   \+ empty_directory(Dir)
+          ->  true
+          ;   throw(E)
+          )).
+
+empty_directory(Dir) :-
+    directory_files(Dir, AllFiles),
+    exclude(hidden, AllFiles, []).
diff --git a/lib/swish/highlight.pl b/lib/swish/highlight.pl
index 38f3f94..56cd9e9 100644
--- a/lib/swish/highlight.pl
+++ b/lib/swish/highlight.pl
@@ -3,7 +3,7 @@
     Author:        Jan Wielemaker
     E-mail:        J.Wielemaker@vu.nl
     WWW:           http://www.swi-prolog.org
-    Copyright (c)  2014-2016, VU University Amsterdam
+    Copyright (c)  2014-2017, VU University Amsterdam
     All rights reserved.
Redistribution and use in source and binary forms, with or without @@ -789,7 +789,11 @@ style(neck(Neck), neck, [ text(Text) ]) :- style(head(Class, Head), Type, [ text, arity(Arity) ]) :- goal_arity(Head, Arity), head_type(Class, Type). +style(goal_term(Class, {_}), brace_term_open-brace_term_close, + [ name({}), arity(1) | More ]) :- + goal_type(Class, _Type, More). style(goal(Class, Goal), Type, [ text, arity(Arity) | More ]) :- + Goal \= {_}, goal_arity(Goal, Arity), goal_type(Class, Type, More). style(file_no_depend(Path), file_no_depends, [text, path(Path)]). @@ -814,6 +818,7 @@ style(delimiter, delimiter, [text]). style(identifier, identifier, [text]). style(module(_Module), module, [text]). style(error, error, [text]). +style(constraint(Set), constraint, [text, set(Set)]). style(type_error(Expect), error, [text,expected(Expect)]). style(syntax_error(_Msg,_Pos), syntax_error, []). style(instantiation_error, instantiation_error, [text]). @@ -870,6 +875,7 @@ neck_text(directive, (:-)). head_type(exported, head_exported). head_type(public(_), head_public). head_type(extern(_), head_extern). +head_type(extern(_,_), head_extern). head_type(dynamic, head_dynamic). head_type(multifile, head_multifile). head_type(unreferenced, head_unreferenced). @@ -893,6 +899,7 @@ goal_type(dynamic(Line), goal_dynamic, [line(Line)]). goal_type(multifile(Line), goal_multifile, [line(Line)]). goal_type(expanded, goal_expanded, []). goal_type(extern(_), goal_extern, []). +goal_type(extern(_,_), goal_extern, []). goal_type(recursion, goal_recursion, []). goal_type(meta, goal_meta, []). goal_type(foreign(_), goal_foreign, []). @@ -966,15 +973,18 @@ css_style(Style, Style). % True if RGB is the color for the named X11 color. x11_color(Name, R, G, B) :- - ( x11_color_cache(_,_,_,_) + ( x11_colors_done -> true - ; load_x11_colours + ; with_mutex(swish_highlight, load_x11_colours) ), x11_color_cache(Name, R, G, B). :- dynamic - x11_color_cache/4. + x11_color_cache/4, + x11_colors_done/0. 
+load_x11_colours :- + x11_colors_done, !. load_x11_colours :- source_file(load_x11_colours, File), file_directory_name(File, Dir), @@ -984,7 +994,8 @@ load_x11_colours :- ( lazy_list(lazy_read_lines(In, [as(string)]), List), maplist(assert_colour, List) ), - close(In)). + close(In)), + asserta(x11_colors_done). assert_colour(String) :- split_string(String, "\s\t\r", "\s\t\r", [RS,GS,BS|NameParts]), @@ -995,6 +1006,8 @@ assert_colour(String) :- downcase_atom(Name0, Name), assertz(x11_color_cache(Name, R, G, B)). +:- catch(initialization(load_x11_colours, prepare_state), _, true). + %% css(?Context, ?Selector, -Style) is nondet. % % Multifile hook to define additional style to apply in a specific diff --git a/lib/swish/include.pl b/lib/swish/include.pl index c4eab89..71c08a7 100644 --- a/lib/swish/include.pl +++ b/lib/swish/include.pl @@ -33,13 +33,13 @@ */ :- module(swish_include, - [ + [ include/2 % +File, +Options ]). :- use_module(storage). :- use_module(config). :- use_module(library(sandbox), []). :- use_module(library(debug)). -:- use_module(library(settings)). +:- use_module(library(option)). :- use_module(library(filesex)). :- use_module(library(error)). :- use_module(library(readutil)). @@ -58,35 +58,68 @@ shared gitty store. It realises this using the following steps: We allow for hierarchical and circular includes. */ +%! include(+File, +Options) +% +% Include file at a specific version. Supported options: +% +% - version(Version) +% Include version Version of File, where Version is a gitty +% commit of the file. This is the same as `:- include(Version).`, +% but more explicit. +% +% If the same file is included at two places it is included at most +% once. Additionally +% +% - If neither is versioned the most recent version is included. +% - If two versions resolve to the same content hash, this is +% included. +% - If a specific version is included, subsequent unspecified +% includes are ignored. 
A subsequent incompatibly versioned +% include results in an error. +% +% The envisioned model is that we can specify which version is, +% possibly indirectly, included by using directives like this: +% +% == +% :- include(File, [version(Hash)]). +% == + +include(File, Version) :- + throw(error(context_error(nodirective, include(File, Version)), _)). swish:term_expansion(:- include(FileIn), Expansion) :- - include_file_id(FileIn, File), + swish:term_expansion(:- include(FileIn, []), Expansion). +swish:term_expansion(:- include(FileIn, Options), Expansion) :- + setup_call_cleanup( + '$push_input_context'(swish_include), + expand_include(FileIn, Options, Expansion), + '$pop_input_context'). + +expand_include(FileIn, Options, Expansion) :- + include_file_id(FileIn, File, Options), + arg(2, File, IncludeID), ( prolog_load_context(module, Module), - clause(Module:'swish included'(File), true) + clause(Module:'swish included'(IncludeID), true) -> Expansion = [] ; Expansion = [ (:- discontiguous('swish included'/1)), - 'swish included'(File), + 'swish included'(IncludeID), (:- include(stream(URI, Stream, [close(true)]))) ], - '$push_input_context'(swish_include), include_data(File, URI, Data), - open_string(Data, Stream), - '$pop_input_context' + open_string(Data, Stream) ). -%! include_data(+FileSpec, -URI, -Data) +%! include_data(+FileID, -URI, -Data) % % Fetch the data to be included and obtain the URI for it. -include_data(Name, URI, Data) :- % Deal with gitty files - atom(Name), +include_data(file(Name, _Data, gitty(Meta)), URI, Data) :- !, - add_extension(Name, FileExt), - catch(storage_file(FileExt, Data, _Meta), + catch(storage_file(Meta.commit, Data, _Meta), error(existence_error(_,_),_), fail), - atom_concat('swish://', FileExt, URI). -include_data(Spec, URI, Data) :- + atom_concat('swish://', Name, URI). 
+include_data(file(Spec, Spec, filesystem), URI, Data) :- absolute_file_name(Spec, Path, [ file_type(prolog), access(read) ]), read_file_to_string(Path, Data, []), Spec =.. [Alias,_], @@ -94,15 +127,28 @@ include_data(Spec, URI, Data) :- format(atom(URI), 'swish://~w/~w', [Alias, NameExt]). -%! include_file_id(+FileIn, -File) is det. +%! include_file_id(+FileIn, -FileID, +Options) is det. % % Normalise an include file identifier and verify its safeness. -include_file_id(FileIn, File) :- +include_file_id(FileIn, file(File, IncludeID, gitty(Meta)), Options) :- atomic(FileIn), !, - atom_string(File, FileIn). -include_file_id(FileIn, File) :- + atom_string(File0, FileIn), + add_extension(File0, File), + ( option(version(Version), Options) + -> storage_meta_data(Version, Meta) + ; storage_meta_data(File, Meta) + ), + atom_concat('swish://', Meta.name, URI), + IncludeID0 = gitty(Meta.commit, Meta.data, URI), + ( prolog_load_context(module, Module), + clause(Module:'swish included'(IncludeIDPrev), true), + compatible_versions(IncludeIDPrev, IncludeID0, Version) + -> IncludeID = IncludeIDPrev + ; IncludeID = IncludeID0 + ). +include_file_id(FileIn, file(File, File, filesystem), _) :- compound(FileIn), FileIn =.. [Alias,NameIn], atom_string(Name, NameIn), @@ -113,6 +159,16 @@ include_file_id(FileIn, File) :- ), File =.. [Alias,Name]. +compatible_versions(Version, Version, _) :- !. +compatible_versions(gitty(_, DataHash, _), gitty(_, DataHash, _), _) :- !. +compatible_versions(Gitty1, Gitty2, Version) :- !, + Gitty1 = gitty(_, _, URI), + Gitty2 = gitty(_, _, URI), + ( var(Version) + -> true + ; throw(error(version_error(Gitty1, Gitty2), _)) + ). + safe_name(Name) :- \+ ( sub_atom(Name, 0, _, _, '../') ; sub_atom(Name, _, _, _, '/../') @@ -172,27 +228,38 @@ sandbox:safe_directive(M:include(stream(Id, Stream, [close(true)]))) :- :- multifile prolog_colour:term_colours/2. 
+prolog_colour:term_colours((:- include(FileIn, Options)), + neck(directive) - + [ goal(built_in,include(FileIn)) - + [ FileClass, + classify + ] + ]) :- + classify_include(FileIn, FileClass, Options). prolog_colour:term_colours((:- include(FileIn)), neck(directive) - [ goal(built_in,include(FileIn)) - [ FileClass ] ]) :- + classify_include(FileIn, FileClass, []). + +classify_include(FileIn, FileClass, Options) :- debug(include, 'Classifying ~p', [FileIn]), - ( catch(include_file_id(FileIn, File), _, fail) - -> classify_include(File, FileClass) + ( catch(include_file_id(FileIn, FileID, Options), _, fail) + -> classify_include(FileID, FileClass) ; FileClass = nofile ), debug(include, 'Class ~p', [FileClass]). -classify_include(File, FileClass) :- - atom(File), +classify_include(file(Name, _DataHash, gitty(Meta)), FileClass) :- !, - add_extension(File, FileExt), - catch(storage_meta_data(FileExt, _Meta), _, fail), - atom_concat('swish://', FileExt, Id), + ( is_hash(Name) + -> format(atom(Id), 'swish://~w@~w', [Meta.name, Name]) + ; atom_concat('swish://', Name, Id) + ), FileClass = file(Id). -classify_include(Spec, FileClass) :- +classify_include(file(Spec, Spec, filesystem), FileClass) :- absolute_file_name(Spec, Path, [ file_type(prolog), access(read) ]), Spec =.. [Alias,_], file_base_name(Path, NameExt), diff --git a/lib/swish/jquery.pl b/lib/swish/jquery.pl index f9bf337..20d8b6f 100644 --- a/lib/swish/jquery.pl +++ b/lib/swish/jquery.pl @@ -3,7 +3,7 @@ Author: Jan Wielemaker E-mail: J.Wielemaker@vu.nl WWW: http://www.swi-prolog.org - Copyright (c) 2015, VU University Amsterdam + Copyright (c) 2018, VU University Amsterdam All rights reserved. Redistribution and use in source and binary forms, with or without @@ -97,3 +97,8 @@ root_selector(this) :- !. root_selector(swish) :- !. root_selector(Selector) :- domain_error(root_selector, Selector). + +:- multifile + sandbox:safe_primitive/1. + +sandbox:safe_primitive(swish_jquery:jquery(_,_,_)). 
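For concreteness, the include/2 directive documented in include.pl above could be used from a SWISH program as follows; the file name `movies` and the version hash are invented for illustration:

```prolog
% Include the most recent version of a shared gitty file:
:- include(movies).

% Pin the include to one specific gitty commit of that file
% (the hash below is a made-up placeholder):
:- include(movies, [version('7fd4f83f2f493c21b03efe1f15252056d2e42a9f')]).
```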
diff --git a/lib/swish/logging.pl b/lib/swish/logging.pl index d2521e4..46f894a 100644 --- a/lib/swish/logging.pl +++ b/lib/swish/logging.pl @@ -3,7 +3,7 @@ Author: Jan Wielemaker E-mail: J.Wielemaker@vu.nl WWW: http://www.swi-prolog.org - Copyright (c) 2015, VU University Amsterdam + Copyright (c) 2017, VU University Amsterdam All rights reserved. Redistribution and use in source and binary forms, with or without @@ -33,7 +33,7 @@ */ :- module(swish_logging, - [ + [ create_log_dir/0 ]). :- use_module(library(http/http_log)). :- use_module(library(broadcast)). @@ -98,3 +98,31 @@ gc_text_hash :- fail ; true ). + +%! create_log_dir +% +% Create the directory for holding the log files + +create_log_dir :- + setting(http:logfile, Term), + directory_spec(Term, DirSpec), + ( absolute_file_name(DirSpec, _, + [ file_type(directory), + access(write), + file_errors(fail) + ]) + -> true + ; absolute_file_name(DirSpec, Dir, + [ solutions(all) + ]), + catch(make_directory(Dir), _, fail) + -> true + ). + +directory_spec(Atom, Dir) :- + atomic(Atom), !, + file_directory_name(Atom, Dir). +directory_spec(Term, DirTerm) :- + Term =.. [Alias,Atom], + file_directory_name(Atom, Dir), + DirTerm =.. [Alias,Dir]. diff --git a/lib/swish/markdown.pl b/lib/swish/markdown.pl index 2b14220..6792634 100644 --- a/lib/swish/markdown.pl +++ b/lib/swish/markdown.pl @@ -104,6 +104,7 @@ wiki_file_codes_to_dom(String, File, DOM) :- prolog:doc_autolink_extension/2. prolog:doc_autolink_extension(swinb, notebook). +prolog:doc_autolink_extension(lnk, permalink). :- public file//2. @@ -133,7 +134,7 @@ file(File, Options) --> html(a([class([alias,file]), href(HREF)], Label)). 
file(File, Options) --> { storage_file(File), - option(label(Label), Options), + option(label(Label), Options, File), http_location_by_id(swish, Swish), directory_file_path(Swish, p, StoreDir), directory_file_path(StoreDir, File, HREF) diff --git a/lib/swish/page.pl b/lib/swish/page.pl index e6ad75e..f25ad8a 100644 --- a/lib/swish/page.pl +++ b/lib/swish/page.pl @@ -34,6 +34,7 @@ :- module(swish_page, [ swish_reply/2, % +Options, +Request + swish_reply_resource/1, % +Request swish_page//1, % +Options swish_navbar//1, % +Options @@ -107,17 +108,23 @@ http:location(pldoc, swish(pldoc), [priority(100)]). % Use Query as the initial query. % - show_beware(Boolean) % Control showing the _beware limited edition_ warning. +% - preserve_state(Boolean) +% If `true`, save state on unload and restore old state on load. swish_reply(Options, Request) :- - authenticate(Request, Auth), - swish_reply2([identity(Auth)|Options], Request). + ( option(identity(_), Options) + -> Options2 = Options + ; authenticate(Request, Auth), + Options2 = [identity(Auth)|Options] + ), + swish_reply2(Options2, Request). swish_reply2(Options, Request) :- option(method(Method), Request), Method \== get, Method \== head, !, swish_rest_reply(Method, Request, Options). swish_reply2(_, Request) :- - serve_resource(Request), !. + swish_reply_resource(Request), !. swish_reply2(Options, Request) :- swish_reply_config(Request, Options), !. swish_reply2(SwishOptions, Request) :- @@ -130,11 +137,12 @@ swish_reply2(SwishOptions, Request) :- ], http_parameters(Request, Params), params_options(Params, Options0), - merge_options(Options0, SwishOptions, Options1), - add_show_beware(Options1, Options2), - source_option(Request, Options2, Options3), - option(format(Format), Options3), - swish_reply3(Format, Options3). 
+ add_show_beware(Options0, Options1), + add_preserve_state(Options1, Options2), + merge_options(Options2, SwishOptions, Options3), + source_option(Request, Options3, Options4), + option(format(Format), Options4), + swish_reply3(Format, Options4). swish_reply3(raw, Options) :- option(code(Code), Options), !, @@ -194,6 +202,18 @@ implicit_no_show_beware(Options) :- implicit_no_show_beware(Options) :- option(background(_), Options). +%! add_preserve_state(+Options0, -Option) is det. +% +% Add preserve_state(false) when called with code. + +add_preserve_state(Options0, Options) :- + option(preserve_state(_), Options0), !, + Options = Options0. +add_preserve_state(Options0, Options) :- + option(code(_), Options0), !, + Options = [preserve_state(false)|Options0]. +add_preserve_state(Options, Options). + %% source_option(+Request, +Options0, -Options) % @@ -301,11 +321,11 @@ confirm_access(_, _). eval_condition(loaded, Path) :- source_file(Path). -%% serve_resource(+Request) is semidet. +%% swish_reply_resource(+Request) is semidet. % % Serve /swish/Resource files. -serve_resource(Request) :- +swish_reply_resource(Request) :- option(path_info(Info), Request), resource_prefix(Prefix), sub_atom(Info, 0, _, _, Prefix), !, @@ -452,15 +472,26 @@ swish_config_hash(Options) --> % The options are set per session. swish_options(Options) --> - { option(show_beware(Show), Options), - JSShow = @(Show) - }, !, - js_script({|javascript(JSShow)|| + js_script({|javascript|| window.swish = window.swish||{}; - window.swish.option = window.swish.options||{}; - window.swish.option.show_beware = JSShow; + window.swish.option = window.swish.option||{}; + |}), + swish_options([show_beware, preserve_state], Options). + +swish_options([], _) --> []. +swish_options([H|T], Options) --> + swish_option(H, Options), + swish_options(T, Options). + +swish_option(Name, Options) --> + { Opt =.. 
[Name,Val], + option(Opt, Options), + JSVal = @(Val) + }, !, + js_script({|javascript(Name, JSVal)|| + window.swish.option[Name] = JSVal; |}). -swish_options(_Options) --> +swish_option(_, _) --> []. %% source(+Type, +Options)// diff --git a/lib/swish/paths.pl b/lib/swish/paths.pl index cda2500..2d3364b 100644 --- a/lib/swish/paths.pl +++ b/lib/swish/paths.pl @@ -3,8 +3,8 @@ Author: Jan Wielemaker E-mail: J.Wielemaker@cs.vu.nl WWW: http://www.swi-prolog.org - Copyright (C): 2017, VU University Amsterdam - CWI Amsterdam + Copyright (C): 2017-2018, VU University Amsterdam + CWI Amsterdam All rights reserved. Redistribution and use in source and binary forms, with or without @@ -58,14 +58,17 @@ user:file_search_path(icons, swish_web(icons)). %! set_swish_path % -% Setup the swish search path. +% Setup the `swish` search path. set_swish_path :- absolute_file_name(swish('swish.pl'), _, [file_errors(fail), access(read)]), !. set_swish_path :- prolog_load_context(directory, Dir), + !, asserta(user:file_search_path(swish, Dir)). +set_swish_path :- + current_prolog_flag(saved_program, true). %! attach_local_packs % diff --git a/lib/swish/plugin/notify.pl b/lib/swish/plugin/notify.pl index 0d1d121..e8ab9ee 100644 --- a/lib/swish/plugin/notify.pl +++ b/lib/swish/plugin/notify.pl @@ -251,6 +251,8 @@ nofollow(DocID, ProfileID, Flags) :- % Gitty file was deleted % - forked(OldCommit, Commit) % Gitty file was forked +% - created(Commit) +% A new gitty file was created % - chat(Message) % A chat message was sent. Message is the JSON content as a dict. % Message contains a `docid` key. @@ -298,15 +300,18 @@ notify_event(follow(DocID, ProfileID, Options)) :- follow(DocID, ProfileID, Options). % events on gitty files notify_event(updated(File, Commit)) :- - atom_concat('gitty:', File, DocID), - notify(DocID, updated(Commit)). 
+ ( storage_meta_data(Commit.get(previous), OldCommit), + atom_concat('gitty:', OldCommit.name, DocID) + -> notify(DocID, forked(OldCommit, Commit)) + ; atom_concat('gitty:', File, DocID), + notify(DocID, updated(Commit)) + ). notify_event(deleted(File, Commit)) :- atom_concat('gitty:', File, DocID), notify(DocID, deleted(Commit)). -notify_event(created(_File, Commit)) :- - storage_meta_data(Commit.get(previous), Meta), - atom_concat('gitty:', Meta.name, DocID), - notify(DocID, forked(Meta, Commit)). +notify_event(created(File, Commit)) :- + atom_concat('gitty:', File, DocID), + notify(DocID, created(Commit)). % chat message notify_event(chat(Message)) :- notify(Message.docid, chat(Message)). @@ -385,6 +390,10 @@ chat_notice(forked(_OldCommit, Commit), []) --> html([b('Forked'), ' into ', \file_name(Commit), ': ', \commit_message_summary(Commit) ]). +chat_notice(created(Commit), []) --> + html([b('Created'), ' ', \file_name(Commit), ': ', + \commit_message_summary(Commit) + ]). commit_message_summary(Commit) --> { Message = Commit.get(commit_message) }, !, diff --git a/lib/swish/projection.pl b/lib/swish/projection.pl index 3cc6939..d35bffa 100644 --- a/lib/swish/projection.pl +++ b/lib/swish/projection.pl @@ -50,25 +50,50 @@ set. for efficiency reasons. */ +:- multifile + reserved_var/1. % +VarName + %! projection(+Spec:list) % % Specify the result variables. Using projection/1 at the start of a % query specifies which variables are part of the result set, in what % order they are displayed and, optionally, whether the results must -% be ordered on one or more variables. Ordering is specified using -% `+Var` (ascending) or `-Var` (descending). If ordering is specified -% for multiple variables, the result set is ordered starting with the -% left-most variable for which ordering is defined. +% be ordered on one or more variables or the solutions should be +% distinct. Each element of Spec is one of the following: +% +% - Var +% Include Var in the result. 
Var must appear in the remainder of +% the body. +% - Var:Annotation +% As Var, respecting Annotation. Valid annotations are below. +% Annotations may be abbreviated, e.g. `asc`, `desc` +% +% - ascending +% Order solutions in ascending order. +% - descending +% Order solutions in descending order. +% - distinct +% Remove duplicates wrt. this argument. +% - AnnA+AnnB +% Multiple annotations +% +% - +Var +% Equivalent to `Var:ascending` +% - -Var +% Equivalent to `Var:descending` +% +% If ordering is specified for multiple variables, the result set is +% ordered starting with the left-most variable for which ordering is +% defined. projection(_). -swish:goal_expansion((Projection,Body), Ordered) :- +swish:goal_expansion((Projection,Body), Aggregate) :- nonvar(Projection), Projection = projection(Spec), must_be(list, Spec), - phrase(order(Spec, Vars), Order), - Order \== [], - Ordered = order_by(Order, Body), + aggregation(Spec, Vars, Body, Aggregate), + !, ignore(set_projection(Vars)). swish:goal_expansion(projection(Vars), true) :- set_projection(Vars). @@ -76,26 +101,97 @@ swish:goal_expansion(projection(Vars), true) :- set_projection(Vars) :- nb_current('$variable_names', Bindings), debug(projection, 'Got ~p; Asking ~p', [Bindings, Vars]), - memberchk('_residuals'=Var, Bindings), - maplist(select_binding(Bindings), Vars, NewBindings), - debug(projection, 'Filtered ~p', [NewBindings]), - b_setval('$variable_names', ['_residuals'=Var|NewBindings]). + preverse_vars(Bindings, NewVars, SelectedBindings), + maplist(select_binding(Bindings), Vars, SelectedBindings), + debug(projection, 'Filtered ~p', [NewVars]), + b_setval('$variable_names', NewVars). + +%! preverse_vars(+Bindings, -ReservedBindings, ?Tail) is det. +% +% Preserve some of the _pseudo bindings_ that communicate additional +% information from the Pengine. May be extended by adding clauses to +% reserved_var/1. + +preverse_vars([], L, L). 
+preverse_vars([Name=Var|T0], [Name=Var|T], L) :- + reserved_var(Name), + !, + preverse_vars(T0, T, L). +preverse_vars([_|T0], T, L) :- + preverse_vars(T0, T, L). + +reserved_var('_residuals'). +reserved_var('_swish__permahash'). select_binding(Bindings, Var, Name=Var) :- member(Name=X, Bindings), Var == X, !. -order([], []) --> +%! aggregation(+Projection:list, -Vars, +Goal0, -Goal) is semidet. +% +% Determine the final projection variables as well as ordering and +% distinct wrapper from the projection argument. + +aggregation(Projection, Vars, Goal0, Goal) :- + modifiers(Projection, Vars, Unique, Order), + munique(Unique, Goal0, Goal1), + morder(Order, Goal1, Goal). + +munique([], Goal, Goal) :- + !. +munique(Vars, Goal0, distinct(Term, Goal0)) :- + Term =.. [v|Vars]. + +morder([], Goal, Goal) :- + !. +morder(Vars, Goal0, order_by(Vars, Goal0)). + +modifiers(Projection, Vars, Unique, Order) :- + phrase(annotations(Projection, Vars), Annot), + Annot \== [], + partition(unique_anot, Annot, Unique, Order). + +unique_anot(distinct(_)). + +annotations([], []) --> []. -order([H|T], [V|VT]) --> - order1(H, V), - order(T, VT). - -order1(V, V) --> {var(V)}, !. -order1(+V, V) --> !, [asc(V)]. -order1(-V, V) --> !, [desc(V)]. -order1(V, V) --> []. +annotations([H|T], [V|VT]) --> + annotations1(H, V), + annotations(T, VT). + +annotations1(V, V) --> {var(V)}, !. +annotations1(+V, V) --> !, [asc(V)]. +annotations1(-V, V) --> !, [desc(V)]. +annotations1(V:Ann, V) --> !, var_annotations(Ann, V). +annotations1(V, V) --> []. + +var_annotations(Var, _) --> {var(Var), instantiation_error(Var)}. +var_annotations(A+B, V) --> !, var_annotations(A,V), var_annotations(B,V). +var_annotations(Anot, V) --> + { var_annotation(Anot, Can), + Term =.. [Can,V] + }, + [ Term ]. + +var_annotation(Anot, Cann) :- + var_anot1(Anot, Cann), + !, + ( var_anot1(Anot, Cann2), + Cann \== Cann2 + -> domain_error(projection_annotation, Anot) + ; true + ). 
+var_annotation(Anot, _Cann) :- + domain_error(projection_annotation, Anot). + +var_anot1(Anot, Cann) :- + var_annot(Long, Cann), + sub_atom(Long, 0, _, _, Anot). + +var_annot(ascending, asc). +var_annot(descending, desc). +var_annot(distinct, distinct). :- multifile sandbox:safe_primitive/1. diff --git a/lib/swish/provenance.pl b/lib/swish/provenance.pl new file mode 100644 index 0000000..b1d5afb --- /dev/null +++ b/lib/swish/provenance.pl @@ -0,0 +1,306 @@ +/* Part of SWISH + + Author: Jan Wielemaker + E-mail: J.Wielemaker@cs.vu.nl + WWW: http://www.swi-prolog.org + Copyright (C): 2017, VU University Amsterdam + CWI Amsterdam + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in + the documentation and/or other materials provided with the + distribution. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, + BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; + LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER + CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN + ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE + POSSIBILITY OF SUCH DAMAGE. 
+*/ + +:- module(swish_provenance, + [ swish_provenance/2, % :Goal, -Provenance + permahash/2, % :Goal, -Hash + current_permahash/3 % ?Name, -Hash, -Meta + ]). +:- use_module(library(apply)). +:- use_module(library(pengines)). +:- use_module(library(lists)). +:- use_module(library(option)). +:- use_module(library(signature)). % from pcache pack +:- use_module(config). +:- use_module(page). +:- use_module(storage). +:- use_module(gitty). +:- use_module(authenticate). +:- use_module(pep). +:- use_module(library(http/http_dispatch)). + +:- meta_predicate + swish_provenance(:, -), + permahash(:, -). + +/** <module> SWISH provenance collection + +This module provides persistent hashes for a goal and its dependencies. +*/ + +:- http_handler(swish('q/'), permalink, [ id(permalink), prefix ]). + +%! permahash(:Goal, -Hash) is det. +% +% Create a hash for Goal and its dependencies. + +permahash(M:Goal, Hash) :- + goal_string(Goal, String), + swish_provenance(M:Goal, Provenance), + storage_store_term(Provenance, ProvHash), + storage_store_term(goal{goal:String, + prov:ProvHash}, Hash). + +goal_string(Goal, String) :- + State = state(''), + ( nb_current('$variable_names', Bindings), + maplist(bind, Bindings), + with_output_to(string(String), portray_clause(Goal)), + nb_setarg(1, State, String), + fail + ; arg(1, State, String) + ). + +bind(Name=Var) :- + Var = '$VAR'(Name). + +%! swish_provenance(:Goal, -Provenance:dict) is det. +% +% Provide provenance information for running Goal. 
+ +swish_provenance(Goal, Provenance) :- + goal_provenance(Goal, Prov0), + ( select(SourceID-Preds, Prov0, Prov1), + split_string(SourceID, "/", "/", ["pengine:", IdS, "src"]), + atom_string(Module, IdS), + Preds \== [] + -> local_source(Module, SourceID, Preds, Source), + convlist(file_prov(Module), Prov1, Used), + ( Used \== [] + -> Provenance = prov{ local: Source, + import: Used } + ; Provenance = prov{ local: Source } + ) + ; Goal = M:_, + convlist(file_prov(M), Prov0, Used), + Used \== [] + -> Provenance = prov{ import: Used } + ; Provenance = prov{} + ). + +file_prov(Module, URI-Preds0, Hash-Preds) :- + current_predicate(Module:'swish included'/1), + Module:'swish included'(gitty(CommitHash, _DataHash, URI)), + !, + Hash = CommitHash, + maplist(unqualify(Module), Preds0, Preds). + +unqualify(M, Pred0, Pred) :- + Pred0.head = M:Plain, + Pred = Pred0.put(head, Plain). + +local_source(Module, SourceID, Preds0, + [source{gitty:Hash, predicates:Preds}]) :- + pengine_self(Me), + pengine_property(Me, source(SourceID, Source)), + !, + storage_store_term(source{text:Source}, Hash), + maplist(unqualify(Module), Preds0, Preds). +local_source(_Module, _SourceID, Preds, Source) :- + maplist(local_def, Preds, Source). + +local_def(Pred0, predicate{head:Head, clauses:Clauses}) :- + M:Head = Pred0.head, + findall(Clause, clause_of(M:Head, Clause), Clauses). + +clause_of(M:Pred, Clause) :- + clause(M:Pred, Body), + ( Body == true + -> Clause = Pred + ; Clause = (Pred :- Body) + ). + + /******************************* + * HOOK FOR .LNK * + *******************************/ + +:- multifile + web_storage:open_hook/3. 
+ +web_storage:open_hook(swish, Options0, Options) :- + option(meta(Meta), Options0), + file_name_extension(_, lnk, Meta.get(name)), + option(code(HashCode), Options0), + atom_string(Hash, HashCode), + is_gitty_hash(Hash), + !, + permalink_code_query(Hash, Code, Query), + merge_options([ code(Code), + q(Query), + type(pl), + show_beware(false) + ], + Options0, Options). +web_storage:open_hook(json, JSON0, JSON) :- + file_name_extension(_, lnk, JSON0.get(meta).get(name)), + atom_string(Hash, JSON0.get(data)), + is_gitty_hash(Hash), + !, + permalink_code_query(Hash, Code, Query), + JSON = JSON0.put(_{data:Code, query:Query}). + + + /******************************* + * RESTORE A PERMALINK * + *******************************/ + +%! permalink(+Request) +% +% Open a query and source from a permalink. Normally mounted on +% `/q/hash`. + +permalink(Request) :- + authenticate(Request, Auth), + permalink(Request, [identity(Auth)]). + +permalink(Request, _Options) :- + swish_reply_resource(Request), !. +permalink(Request, Options) :- + swish_reply_config(Request, Options), !. +permalink(Request, Options) :- + option(path_info(Hash), Request), + is_gitty_hash(Hash), + authorized(gitty(download(Hash, permalink)), Options), + permalink_code_query(Hash, Code, Query), + swish_reply([ code(Code), + q(Query), + show_beware(false) + | Options + ], + Request). + +permalink_code_query(Hash, Code, Query) :- + storage_load_term(Hash, PermaData), + _{goal:Query, prov:Prov} :< PermaData, + storage_load_term(Prov, ProvData), + ( _{local:Local} :< ProvData + -> maplist(source, Local, Sources), + atomics_to_string(Sources, "\n", Code0), + import_versions(ProvData, Code0, Code) + ; import_versions(ProvData, "", Code) + ). + +source(Prov, Source) :- + storage_load_term(Prov.get(gitty), LocalData), + Source = LocalData.get(text). + +%! import_versions(+ProvData, +Code0, -Code) is det. +% +% If there are imported files, check that their versions are current. 
+% If not, prepend an include statement to import the right version.
+
+import_versions(ProvData, Code0, Code) :-
+    _{import:Import} :< ProvData,
+    convlist(import_version(Code0), Import, Strings),
+    Strings \== [],
+    !,
+    append(Strings, [Code0], AllStrings),
+    atomics_to_string(AllStrings, "\n", Code).
+import_versions(_, Code, Code).
+
+%! import_version(+Code0, +Import, -String) is semidet.
+%
+% If Import is not the HEAD of the imported file, unify String with a
+% header that includes the specific version.
+%
+% @tbd We also have the hashes of the predicates from the imported
+% files that are used. We could use this to verify that the imported
+% predicates have not changed and therefore we can (still) import the
+% HEAD rather than the specific version.
+%
+% @tbd Currently assumes there is either no local source (2nd clause)
+% or the local source contains all required `:- include`. Is that
+% true?
+
+import_version(Code0, Hash-_Predicates, String) :-
+    storage_meta_data(Hash, Meta),
+    import_version(Code0, Hash, Meta, String).
+
+import_version(_Code0, Hash, Meta, String) :-
+    \+ Meta.get(symbolic) == "HEAD", !,
+    format_time(string(Date), '%+', Meta.time),
+    file_name_extension(Base, _, Meta.name),
+    format(string(String),
+           '% Permalink: using "~w" from ~s\n\c
+           :- include(~q, [version(~q)]).',
+           [ Base, Date, Base, Hash ]).
+import_version("", _Hash, Meta, String) :-
+    file_name_extension(Base, _, Meta.name),
+    format(string(String),
+           '% Permalink: using current version of "~w"\n\c
+           :- include(~q).',
+           [ Base, Base ]).
+
+
+ /*******************************
+ *          ENUMERATE          *
+ *******************************/
+
+%! current_permahash(?Name, -Hash, -Meta) is nondet.
+%
+% Enumerate saved permahashes.
+%
+% @arg Name is the name of the permahash file
+% @arg Meta is the meta data of this file (author, time, tags, etc.)
+% @arg Hash is the permahash + +current_permahash(Name, Hash, Meta) :- + storage_file_extension(Name, lnk), + storage_file(Name, HashString, Meta), + atom_string(Hash, HashString), + is_gitty_hash(Hash). + + + /******************************* + * HOOK * + *******************************/ + +:- multifile + swish_config:config/2, + swish_trace:pre_context/3. + +swish_config:config(permahash_var, '_swish__permahash'). + +swish_trace:pre_context('_swish__permahash', Goal, Hash) :- + permahash(Goal, Hash). + + + /******************************* + * SANDBOX * + *******************************/ + +:- multifile sandbox:safe_meta_predicate/1. + +sandbox:safe_meta_predicate(swish_provenance:permahash/2). diff --git a/lib/swish/render/c3.pl b/lib/swish/render/c3.pl index fe6b969..7127109 100644 --- a/lib/swish/render/c3.pl +++ b/lib/swish/render/c3.pl @@ -153,12 +153,19 @@ valid_c3_data(Data0, Data) :- -> Data0 = Data ; Data = Data0.put(rows,Rows) ). -valid_c3_data(Data, Data) :- - Columns = Data.get(columns), !, - must_be(acyclic, Columns), - must_be(list,Columns), - maplist(must_be(list),Columns). - +valid_c3_data(Data0, Data) :- + Columns0 = Data0.get(columns), !, + must_be(acyclic, Columns0), + ( rows_to_matrix(Columns0, Columns) + -> true + ; maplist(is_list, Columns0) + -> Columns = Columns0 + ), + must_be(list(ground), Columns), + ( same_term(Columns0, Columns) + -> Data0 = Data + ; Data = Data0.put(columns,Columns) + ). valid_c3_data(Data, Data) :- throw(error(c3_no_data(Data), _)). @@ -182,7 +189,8 @@ rows_to_matrix(Dicts, [Keys|Rows]) :- maplist(compound_arguments, Compounds, Rows). :- endif. rows_to_matrix(Compounds, Rows) :- - dif(Name/Arity, []/2), % avoid lists + functor([_], F, A), + dif(Name/Arity, F/A), % avoid lists maplist(name_arity_compound(Name, Arity), Compounds, Rows), !. rows_to_matrix(Lists, Lists) :- maplist(length_list(_Columns), Lists). 
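The `rows_to_matrix/2` fix above replaces the hard-coded `[]/2` exclusion with `functor([_], F, A)`, so the list functor is computed portably (SWI-Prolog 7 lists are `'[|]'/2`, not `'.'/2`). The conversion itself can be sketched in Python, with tuples standing in for Prolog compounds; the function name and error handling here are illustrative, not part of the patch:

```python
def rows_to_matrix(rows):
    """Sketch of the c3 renderer's rows_to_matrix/2 for compound rows.

    A list of lists is already a matrix (the Prolog code excludes the
    list functor for exactly this reason); otherwise every row must be
    the same Name/Arity compound, and the argument vectors form the
    matrix."""
    if not rows:
        return rows
    if all(isinstance(r, list) for r in rows):
        width = len(rows[0])
        if not all(len(r) == width for r in rows):
            raise ValueError("rows have unequal lengths")
        return rows
    # tuple rows model compounds: ('point', 1, 2) ~ point(1,2)
    name, arity = rows[0][0], len(rows[0]) - 1
    if not all(r[0] == name and len(r) - 1 == arity for r in rows):
        raise ValueError("rows are not uniform compounds")
    return [list(r[1:]) for r in rows]
```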
diff --git a/lib/swish/render/wordnet.pl b/lib/swish/render/wordnet.pl new file mode 100644 index 0000000..e478397 --- /dev/null +++ b/lib/swish/render/wordnet.pl @@ -0,0 +1,79 @@ +/* Part of SWISH + + Author: Jan Wielemaker + E-mail: J.Wielemaker@vu.nl + WWW: http://www.swi-prolog.org + Copyright (c) 2017, VU University Amsterdam + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in + the documentation and/or other materials provided with the + distribution. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, + BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; + LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER + CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN + ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE + POSSIBILITY OF SUCH DAMAGE. +*/ + +:- module(swish_render_wordnet, + [ term_rendering//3 % +Term, +Vars, +Options + ]). +:- use_module(library(http/html_write)). +:- use_module('../render'). +:- use_module(library(wn)). % from wordnet pack + +:- register_renderer(wordnet, "Show WordNet synsets"). 
+ +/** <module> SWISH wordnet renderer + +Renders a WordNet synset-id (integer) as a list of words. +*/ + +%% term_rendering(+Synset, +Vars, +Options)// +% +% Renders a Synset as a list of words. + +term_rendering(Synset, _Vars, _Options) --> + { integer(Synset), + Synset > 100000000, + Synset < 500000000, + findall(Word, wn_s(Synset, _, Word, _, _, _), Words), + Words \== [], + ( wn_g(Synset, Gloss) + -> Attr = [title(Gloss)] + ; Attr = [] + ) + }, + html(span([class(synset)|Attr], + [ span(class('synset-id'), Synset), ' (WN: ', + \words(Words), ')' + ])). + +words([]) --> []. +words([H|T]) --> + word(H), + ( {T == []} + -> [] + ; html(', '), + words(T) + ). + +word(H) --> + html(span(class('wn-word'), H)). diff --git a/lib/swish/search.pl b/lib/swish/search.pl index 297f690..f70883d 100644 --- a/lib/swish/search.pl +++ b/lib/swish/search.pl @@ -3,7 +3,7 @@ Author: Jan Wielemaker E-mail: J.Wielemaker@vu.nl WWW: http://www.swi-prolog.org - Copyright (c) 2015-2017, VU University Amsterdam + Copyright (c) 2015-2018, VU University Amsterdam All rights reserved. Redistribution and use in source and binary forms, with or without @@ -43,6 +43,7 @@ :- use_module(library(http/http_json)). :- use_module(library(prolog_source)). :- use_module(library(option)). +:- use_module(library(debug)). :- use_module(library(solution_sequences)). :- use_module(config). @@ -63,7 +64,6 @@ search from the server side. What do we want to search for? */ :- http_handler(swish(typeahead), typeahead, [id(swish_typeahead)]). -:- http_handler(swish(search), search, [id(swish_search)]). %% search_box(+Options)// % @@ -193,14 +193,3 @@ sow(Text, Offset) :- ; char_type(Start, upper), char_type(Before, lower) ), !. - -%% search(+Request) -% -% Handle an actual search request from the SWISH search box. -% Returns an HTML document with the actual results that is -% displayed in a modal dialog. - -search(_Request) :- - reply_html_page(search, - [], - h1('Search results')). 
diff --git a/lib/swish/storage.pl b/lib/swish/storage.pl index c3c7d35..77cabe7 100644 --- a/lib/swish/storage.pl +++ b/lib/swish/storage.pl @@ -3,7 +3,7 @@ Author: Jan Wielemaker E-mail: J.Wielemaker@vu.nl WWW: http://www.swi-prolog.org - Copyright (c) 2014-2017, VU University Amsterdam + Copyright (c) 2014-2018, VU University Amsterdam All rights reserved. Redistribution and use in source and binary forms, with or without @@ -34,9 +34,18 @@ :- module(web_storage, [ storage_file/1, % ?File + storage_file_extension/2, % ?File, ?Extension storage_file/3, % +File, -Data, -Meta storage_meta_data/2, % +File, -Meta - storage_meta_property/2 % +Meta, ?Property + storage_meta_property/2, % +Meta, ?Property + + storage_fsck/0, + storage_repack/0, + storage_repack/1, % +Options + storage_unpack/0, + + storage_store_term/2, % +Term, -Hash + storage_load_term/2 % +Hash, -Term ]). :- use_module(library(http/http_dispatch)). :- use_module(library(http/http_parameters)). @@ -51,6 +60,8 @@ :- use_module(library(broadcast)). :- use_module(library(readutil)). :- use_module(library(solution_sequences)). +:- use_module(library(dcg/basics)). +:- use_module(library(pcre)). :- use_module(page). :- use_module(gitty). @@ -71,16 +82,32 @@ their own version. :- setting(directory, callable, data(storage), 'The directory for storing files.'). -:- http_handler(swish('p/'), web_storage, [ id(web_storage), prefix ]). +:- http_handler(swish('p/'), + web_storage, + [ id(web_storage), prefix ]). +:- http_handler(swish('source_list'), + source_list, + [ id(source_list) ]). +:- http_handler(swish('source_modified'), + source_modified, + [ id(source_modified) ]). -:- initialization open_gittystore. % TBD: make this lazy? +:- listen(http(pre_server_start), + open_gittystore(_)). :- dynamic storage_dir/1. :- volatile storage_dir/1. -open_gittystore :- - storage_dir(_), !. -open_gittystore :- +open_gittystore(Dir0) :- + storage_dir(Dir), !, + Dir = Dir0. 
+open_gittystore(Dir) :- + with_mutex(web_storage, open_gittystore_guarded(Dir0)), + Dir = Dir0. + +open_gittystore_guarded(Dir) :- + storage_dir(Dir), !. +open_gittystore_guarded(Dir) :- setting(directory, Spec), absolute_file_name(Spec, Dir, [ file_type(directory), @@ -89,7 +116,7 @@ open_gittystore :- ]), !, gitty_open(Dir, []), asserta(storage_dir(Dir)). -open_gittystore :- +open_gittystore_guarded(Dir) :- setting(directory, Spec), absolute_file_name(Spec, Dir, [ solutions(all) @@ -119,6 +146,7 @@ create_store(Dir) :- web_storage(Request) :- authenticate(Request, Auth), option(method(Method), Request), + open_gittystore(_), storage(Method, Request, [identity(Auth)]). :- multifile @@ -195,7 +223,10 @@ storage(put, Request, Options) :- request_file(Request, Dir, File), ( Dict.get(update) == "meta-data" -> gitty_data(Dir, File, Data, _OldMeta) - ; option(data(Data), Dict, "") + ; writeable(File) + -> option(data(Data), Dict, "") + ; option(path(Path), Request), + throw(http_reply(forbidden(Path))) ), meta_data(Dir, Dict, PrevMeta, Meta, Options), storage_url(File, URL), @@ -222,6 +253,9 @@ storage(delete, Request, Options) :- broadcast(swish(deleted(File, Commit))), reply_json_dict(true). +writeable(File) :- + \+ file_name_extension(_, lnk, File). + %% update_error(+Error, +Storage, +Data, +File, +URL) % % If error signals an edit conflict, prepare an HTTP =|409 @@ -388,13 +422,13 @@ storage_get(Request, Format, Options) :- storage_get(swish, Dir, Type, FileOrHash, Request) :- gitty_data_or_default(Dir, Type, FileOrHash, Code, Meta), chat_count(Meta, Count), - swish_reply([ code(Code), - file(FileOrHash), - st_type(gitty), - meta(Meta), - chat_count(Count) - ], - Request). + swish_show([ code(Code), + file(FileOrHash), + st_type(gitty), + meta(Meta), + chat_count(Count) + ], + Request). 
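The rewritten `open_gittystore/1` above uses the classic check/lock/recheck idiom: a cheap unguarded test of `storage_dir/1`, then the same test repeated inside `with_mutex/2`, so concurrent HTTP workers initialise the store exactly once. A minimal Python sketch of that idiom (the directory name and function name are illustrative):

```python
import threading

_lock = threading.Lock()
_store_dir = None                      # mirrors the storage_dir/1 fact


def open_gittystore(path="data/storage"):   # hypothetical default path
    """Check/lock/recheck: fast unguarded test, then retest under the
    mutex so only one thread ever performs the initialisation."""
    global _store_dir
    if _store_dir is not None:         # fast path, no lock taken
        return _store_dir
    with _lock:                        # slow path, serialised
        if _store_dir is None:         # recheck: another thread may
            _store_dir = path          # have won the race meanwhile
        return _store_dir
```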
storage_get(raw, Dir, Type, FileOrHash, _Request) :-
	gitty_data_or_default(Dir, Type, FileOrHash, Code, Meta),
	file_mime_type(Meta.name, MIME),
@@ -403,7 +437,12 @@ storage_get(raw, Dir, Type, FileOrHash, _Request) :-
 storage_get(json, Dir, Type, FileOrHash, _Request) :-
	gitty_data_or_default(Dir, Type, FileOrHash, Code, Meta),
	chat_count(Meta, Count),
-	reply_json_dict(json{data:Code, meta:Meta, chats:_{total:Count}}).
+	JSON0 = json{data:Code, meta:Meta, chats:_{total:Count}},
+	(   open_hook(json, JSON0, JSON)
+	->  true
+	;   JSON = JSON0
+	),
+	reply_json_dict(JSON).
 storage_get(history(Depth, Includes), Dir, _, File, _Request) :-
	gitty_history(Dir, File, History, [depth(Depth),includes(Includes)]),
	reply_json_dict(History).
@@ -482,11 +521,26 @@ random_char(Char) :-
	sub_atom(From, I, 1, _, Char).
+
+%! swish_show(+Options, +Request)
+%
+% Handle a document. First calls the hook open_hook/3 to rewrite
+% the document. This is used for, e.g., permahashes.
+
+:- multifile open_hook/3.
+
+swish_show(Options0, Request) :-
+	open_hook(swish, Options0, Options), !,
+	swish_reply(Options, Request).
+swish_show(Options, Request) :-
+	swish_reply(Options, Request).
+
+
 /*******************************
 *          INTERFACE           *
 *******************************/

-%% storage_file(?File) is semidet.
+%% storage_file(?File) is nondet.
+%! storage_file_extension(?File, ?Extension) is nondet.
 %% storage_file(+File, -Data, -Meta) is semidet.
 %% storage_meta_data(+File, -Meta) is semidet.
 %
@@ -496,15 +550,22 @@ random_char(Char) :-
 %	@arg Meta is a dict holding the meta data about the file.

 storage_file(File) :-
-	storage_dir(Dir),
-	gitty_file(Dir, File, _Head).
+	storage_file_extension(File, _).
+
+storage_file_extension(File, Ext) :-
+	open_gittystore(Dir),
+	gitty_file(Dir, File, Ext, _Head).

 storage_file(File, Data, Meta) :-
-	storage_dir(Dir),
+	open_gittystore(Dir),
+	(   var(File)
+	->  gitty_file(Dir, File, _Head)
+	;   true
+	),
	gitty_data(Dir, File, Data, Meta).
storage_meta_data(File, Meta) :- - storage_dir(Dir), + open_gittystore(Dir), ( var(File) -> gitty_file(Dir, File, _Head) ; true @@ -532,12 +593,65 @@ meta_property(modify(Modify), _, Meta) :- ; Modify = [any,login,owner] ). -current_meta_property(peer(_Atom), dict). -current_meta_property(public(_Bool), dict). -current_meta_property(time(_Seconds), dict). -current_meta_property(author(_String), dict). -current_meta_property(avatar(_String), dict). -current_meta_property(modify(_List), derived). +current_meta_property(peer(_Atom), dict). +current_meta_property(public(_Bool), dict). +current_meta_property(time(_Seconds), dict). +current_meta_property(author(_String), dict). +current_meta_property(identity(_String), dict). +current_meta_property(avatar(_String), dict). +current_meta_property(modify(_List), derived). + +%! storage_store_term(+Term, -Hash) is det. +%! storage_load_term(+Hash, -Term) is det. +% +% Add/retrieve terms from the gitty store. This is used to create +% permanent links to arbitrary objects. + +storage_store_term(Term, Hash) :- + open_gittystore(Dir), + with_output_to(string(S), write_canonical(Term)), + gitty_save(Dir, S, term, Hash). + +storage_load_term(Hash, Term) :- + open_gittystore(Dir), + gitty_load(Dir, Hash, Data, term), + term_string(Term, Data). + + + /******************************* + * MAINTENANCE * + *******************************/ + +%! storage_fsck +% +% Enumerate and check the consistency of the entire store. + +storage_fsck :- + open_gittystore(Dir), + gitty_fsck(Dir). + +%! storage_repack is det. +%! storage_repack(+Options) is det. +% +% Repack the storage directory. Currently only supports the +% `files` driver. For database drivers this is supposed to be +% handled by the database. + +storage_repack :- + storage_repack([]). +storage_repack(Options) :- + open_gittystore(Dir), + ( gitty_driver(Dir, files) + -> gitty_driver_files:repack_objects(Dir, Options) + ; print_message(informational, gitty(norepack(driver))) + ). 
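`storage_store_term/2` and `storage_load_term/2` above give permalinks a content-addressed home for arbitrary terms: the term is written canonically, saved under a hash, and later re-read with `term_string/2`. A rough Python analogue of that round trip, assuming an in-memory dict in place of the gitty store and SHA-1 as the content hash (the real hash is whatever `gitty_save/4` computes):

```python
import ast
import hashlib

_store = {}          # in-memory stand-in for the gitty object store


def storage_store_term(term):
    # write_canonical/1 analogue: a canonical, re-readable text form
    data = repr(term)
    h = hashlib.sha1(data.encode("utf-8")).hexdigest()
    _store[h] = data                  # saved under its content hash
    return h


def storage_load_term(h):
    # term_string/2 analogue: parse the stored text back into a term
    return ast.literal_eval(_store[h])
```

Because the key is derived from the canonical text, storing the same term twice yields the same hash, which is what makes the links "permanent".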
+ +storage_unpack :- + open_gittystore(Dir), + ( gitty_driver(Dir, files) + -> gitty_driver_files:unpack_packs(Dir) + ; print_message(informational, gitty(nounpack(driver))) + ). /******************************* @@ -561,7 +675,7 @@ current_meta_property(modify(_List), derived). % @tbd We should only demand public on public servers. swish_search:typeahead(file, Query, FileInfo, _Options) :- - storage_dir(Dir), + open_gittystore(Dir), gitty_file(Dir, File, Head), gitty_commit(Dir, Head, Meta), Meta.get(public) == true, @@ -591,7 +705,7 @@ swish_search:typeahead(store_content, Query, FileInfo, Options) :- limit(25, search_store_content(Query, FileInfo, Options)). search_store_content(Query, FileInfo, Options) :- - storage_dir(Dir), + open_gittystore(Dir), gitty_file(Dir, File, Head), gitty_data(Dir, Head, Data, Meta), Meta.get(public) == true, @@ -605,6 +719,357 @@ search_file(File, Meta, Data, Query, FileInfo, Options) :- line:LineNo, text:Line, query:Query }). + + /******************************* + * SOURCE LIST * + *******************************/ + +%% source_list(+Request) +% +% List source files. Request parameters: +% +% - q(Query) +% Query is a string for which the following sub strings +% are treated special: +% $ "..." : +% A quoted string is taken as a string search +% $ /.../[xim]* +% Regular expression search +% $ tag:Tag : +% Must have tag containing +% $ type:Type : +% Limit to one of `pl`, `swinb` or `lnk` +% $ user:User : +% Must have user containing. If User is `me` must be +% owned by current user +% $ name:Name : +% Must have name containing +% - o(Order) +% Order by `time` (default), `name`, `author` or `type` +% - offset(+Offset) +% - limit(+Limit) +% - display_name +% - avatar +% Weak identity parameters used to identify _own_ documents +% that are also weakly identified. 
+% +% Reply is a JSON object containing `count` (total matches), +% `cpu` (CPU time) and `matches` (list of matching sources) +% +% @tbd Search the content when searching a .lnk file? +% @tbd Speedup expensive searches. Cache? Use external DB? + +source_list(Request) :- + authenticate(Request, Auth), + http_parameters(Request, + [ q(Q, [optional(true)]), + o(Order, [ oneof([time,name,author,type]), + default(time) + ]), + offset(Offset, [integer, default(0)]), + limit(Limit, [integer, default(10)]), + display_name(DisplayName, [optional(true), string]), + avatar(Avatar, [optional(true), string]) + ]), + bound(Auth.put(_{display_name:DisplayName, avatar:Avatar}), AuthEx), + order(Order, Field, Cmp), + last_modified(Modified), + statistics(cputime, CPU0), + findall(Source, source(Q, AuthEx, Source), AllSources), + statistics(cputime, CPU1), + length(AllSources, Count), + CPU is CPU1 - CPU0, + sort(Field, Cmp, AllSources, Ordered), + list_offset_limit(Ordered, Offset, Limit, Sources), + reply_json_dict(json{total:Count, offset:Offset, + cpu:CPU, modified:Modified, + matches:Sources}). + +list_offset_limit(List0, Offset, Limit, List) :- + list_offset(List0, Offset, List1), + list_limit(List1, Limit, List). + +list_offset([_|T0], Offset, T) :- + succ(O1, Offset), !, + list_offset(T0, O1, T). +list_offset(List, _, List). + +list_limit([H|T0], Limit, [H|T]) :- + succ(L1, Limit), !, + list_limit(T0, L1, T). +list_limit(_, _, []). + +order(type, ext, @=<) :- !. +order(time, time, @>=) :- !. +order(Field, Field, @=<). + +source(Q, Auth, Source) :- + parse_query(Q, Query), + source_q(Query, Auth, Source). + +source_q([user("me")], Auth, _Source) :- + \+ _ = Auth.get(avatar), + \+ user_property(Auth, identity(_Id)), !, + fail. 
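The handler above sorts all matches with `sort/4` on the field selected by `order/3` and then pages the result with `list_offset_limit/4`. The ordering rules (`type` orders on the file extension, `time` orders newest first, anything else ascending) and the paging can be sketched as follows; the helper names are illustrative and the field names mirror the JSON reply:

```python
def order_key(order):
    # order/3: `type` sorts on the extension, `time` newest first,
    # any other field ascending
    if order == "type":
        return "ext", False
    if order == "time":
        return "time", True
    return order, False


def list_matches(sources, order="time", offset=0, limit=10):
    field, newest_first = order_key(order)
    ranked = sorted(sources, key=lambda s: s[field], reverse=newest_first)
    return ranked[offset:offset + limit]   # list_offset_limit/4
```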
+source_q(Query, Auth, Source) :- + type_constraint(Query, Query1, Type), + partition(content_query, Query1, + ContentConstraints, MetaConstraints), + storage_file_extension(File, Type), + source_data(File, Meta, Source), + visible(Meta, Auth, MetaConstraints), + maplist(matches_meta(Source, Auth), MetaConstraints), + matches_content(ContentConstraints, File). + +content_query(string(_)). +content_query(regex(_)). + +source_data(File, Meta, Source) :- + storage_meta_data(File, Meta), + file_name_extension(_, Type, File), + Info = _{time:_, tags:_, author:_, avatar:_, name:_}, + Info >:< Meta, + bound(Info, Info2), + Source = Info2.put(_{type:st_gitty, ext:Type}). + +bound(Dict0, Dict) :- + dict_pairs(Dict0, Tag, Pairs0), + include(bound, Pairs0, Pairs), + dict_pairs(Dict, Tag, Pairs). + +bound(_-V) :- nonvar(V). + +%! visible(+FileMeta, +Auth, +MetaConstraints) is semidet. + +visible(Meta, Auth, Constraints) :- + memberchk(user("me"), Constraints), + !, + owns(Auth, Meta, user(_)). +visible(Meta, _Auth, _Constraints) :- + Meta.get(public) == true, !. +visible(Meta, Auth, _Constraints) :- + owns(Auth, Meta, _). + +%! owns(+Auth, +Meta, ?How) is semidet. +% +% True if the file represented by Meta is owned by the user +% identified as Auth. If this is a strong identity we must give a +% strong answer. +% +% @tbd Weaker identity on the basis of author, avatar +% properties and/or IP properties. + +owns(Auth, Meta, user(me)) :- + storage_meta_property(Meta, identity(Id)), !, + user_property(Auth, identity(Id)). +owns(Auth, Meta, user(avatar)) :- + storage_meta_property(Meta, avatar(Id)), + user_property(Auth, avatar(Id)), !. +owns(Auth, Meta, user(nickname)) :- + Auth.get(display_name) == Meta.get(author), !. +owns(Auth, Meta, host(How)) :- % trust same host and local host + Peer = Auth.get(peer), + ( Peer == Meta.get(peer) + -> How = same + ; sub_atom(Meta.get(peer), 0, _, _, '127.0.0.') + -> How = local + ). + +%! matches_meta(+Source, +Auth, +Query) is semidet. 
+% +% True when Source matches the meta-data requirements + +matches_meta(Dict, _, tag(Tag)) :- !, + ( Tag == "" + -> Dict.get(tags) \== [] + ; member(Tagged, Dict.get(tags)), + match_meta(Tag, Tagged) + -> true + ). +matches_meta(Dict, _, name(Name)) :- !, + match_meta(Name, Dict.get(name)). +matches_meta(Dict, _, user(Name)) :- + ( Name \== "me" + -> match_meta(Name, Dict.get(author)) + ; true % handled in visible/3 + ). + +match_meta(regex(RE), Value) :- !, + re_match(RE, Value). +match_meta(String, Value) :- + sub_atom_icasechk(Value, _, String). + +matches_content([], _) :- !. +matches_content(Constraints, File) :- + storage_file(File, Data, _Meta), + maplist(match_content(Data), Constraints). + +match_content(Data, string(S)) :- + sub_atom_icasechk(Data, _, S), !. +match_content(Data, regex(RE)) :- + re_match(RE, Data). + +%! type_constraint(+Query0, -Query, -Type) is det. +% +% Extract the type constraints from the query as we can handle +% that efficiently. + +type_constraint(Query0, Query, Type) :- + partition(is_type, Query0, Types, Query), + ( Types == [] + -> true + ; Types = [type(Type)] + -> true + ; maplist(arg(1), Types, List), + freeze(Type, memberchk(Type, List)) + ). + +is_type(type(_)). + +%! parse_query(+String, -Query) is det. +% +% Parse a query, resulting in a list of Name(Value) pairs. Name is +% one of `tag`, `user`, `type`, `string` or `regex`. +% +% @tbd: Should we allow for logical combinations? + +parse_query(Q, Query) :- + var(Q), !, + Query = []. +parse_query(Q, Query) :- + string_codes(Q, Codes), + phrase(query(Query), Codes). + +query([H|T]) --> + blanks, + query1(H), !, + query(T). +query([]) --> + blanks. + +query1(Q) --> + tag(Tag, Value), !, + {Q =.. [Tag,Value]}. +query1(Q) --> + "\"", string(Codes), "\"", !, + { string_codes(String, Codes), + Q = string(String) + }. +query1(Q) --> + "/", string(Codes), "/", re_flags(Flags), !, + { string_codes(String, Codes), + re_compile(String, RE, Flags), + Q = regex(RE) + }. 
+query1(Q) -->
+    next_word(String),
+    { String \== "",
+      re_compile(String, RE,
+                 [ extended(true),
+                   caseless(true)
+                 ]),
+      Q = regex(RE)
+    }.
+
+re_flags([H|T]) -->
+    re_flag(H), !,
+    re_flags(T).
+re_flags([]) -->
+    blank.
+re_flags([]) -->
+    eos.
+
+re_flag(caseless(true)) --> "i".
+re_flag(extended(true)) --> "x".
+re_flag(multiline(true)) --> "m".
+re_flag(dotall(true)) --> "s".
+
+next_word(String) -->
+    blanks, nonblank(H), string(Codes), ( blank ; eos ), !,
+    { string_codes(String, [H|Codes]) }.
+
+tag(name, Value) --> "name:", tag_value(Value, _).
+tag(tag, Value) --> "tag:", tag_value(Value, _).
+tag(user, Value) --> "user:", tag_value(Value, _).
+tag(type, Value) --> "type:", tag_value(String, string(_)), { atom_string(Value, String) }.
+
+tag_value(String, string(quoted)) -->
+    blanks, "\"", !, string(Codes), "\"", !,
+    { string_codes(String, Codes) }.
+tag_value(Q, regex) -->
+    blanks, "/", string(Codes), "/", re_flags(Flags), !,
+    { Codes == []
+    -> Q = ""
+    ;  string_codes(String, Codes),
+       re_compile(String, RE, Flags),
+       Q = regex(RE)
+    }.
+tag_value(String, string(nonquoted)) -->
+    nonblank(H), !,
+    string(Codes),
+    ( blank ; eos ), !,
+    { string_codes(String, [H|Codes]) }.
+tag_value("", empty) -->
+    "".
+
+ /*******************************
+ *        TRACK CHANGES        *
+ *******************************/
+
+%! source_modified(+Request)
+%
+% Reply with the last modification time of the source repo. If
+% there is no modification we use the time the server was started.
+%
+% This is a poor man's solution to keep the client cache
+% consistent. Need to think about a better way to cache searches
+% client and/or server side.
+
+source_modified(Request) :-
+    authenticate(Request, _Auth),
+    last_modified(Time),
+    reply_json_dict(json{modified:Time}).
+
+:- dynamic gitty_last_modified/1.
+
+update_last_modified(_,_) :-
+    with_mutex(gitty_last_modified,
+               update_last_modified_sync).
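The DCG above defines a small search-query language: `tag:value` pairs, `"quoted strings"`, `/regex/` with `imsx` flags, and bare words (which `query1//1` compiles to caseless regexes). A hedged Python sketch of the tokenisation, good enough to see the constraint shapes it produces — bare words are kept as plain strings here rather than compiled, and the names are illustrative:

```python
import re

_TAGS = ("name", "tag", "user", "type")
_TOKEN = re.compile(
    r'\s*(?:(?P<tag>' + "|".join(_TAGS) + r'):(?P<tval>"[^"]*"|\S*)'
    r'|"(?P<str>[^"]*)"'                    # "quoted string"
    r'|/(?P<rex>[^/]*)/(?P<flags>[imsx]*)'  # /regex/flags
    r'|(?P<word>\S+))')                     # bare word


def parse_query(q):
    """Split a search string into (name, value) constraints, roughly
    as parse_query/2 does."""
    out = []
    for m in _TOKEN.finditer(q):
        if m.group("tag"):
            out.append((m.group("tag"), m.group("tval").strip('"')))
        elif m.group("str") is not None:
            out.append(("string", m.group("str")))
        elif m.group("rex") is not None:
            out.append(("regex", (m.group("rex"), m.group("flags"))))
        else:
            out.append(("word", m.group("word")))
    return out
```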
+ +update_last_modified_sync :- + get_time(Now), + retractall(gitty_last_modified(_)), + asserta(gitty_last_modified(Now)). + +last_modified(Time) :- + debugging(swish(sourcelist)), !, % disable caching + get_time(Now), + Time is Now + 60. +last_modified(Time) :- + with_mutex(gitty_last_modified, + last_modified_sync(Time)). + +last_modified_sync(Time) :- + ( gitty_last_modified(Time) + -> true + ; statistics(process_epoch, Time) + ). + +:- unlisten(swish(_)), + listen(swish(Event), notify_event(Event)). + +% events on gitty files +notify_event(updated(File, Commit)) :- + atom_concat('gitty:', File, DocID), + update_last_modified(Commit, DocID). +notify_event(deleted(File, Commit)) :- + atom_concat('gitty:', File, DocID), + update_last_modified(Commit, DocID). +notify_event(created(File, Commit)) :- + atom_concat('gitty:', File, DocID), + update_last_modified(Commit, DocID). + + /******************************* * MESSAGES * *******************************/ diff --git a/lib/swish/swish_debug.pl b/lib/swish/swish_debug.pl index b589ff7..d463098 100644 --- a/lib/swish/swish_debug.pl +++ b/lib/swish/swish_debug.pl @@ -95,13 +95,21 @@ stale_module_property(M, thread_status, Status) :- pengine_property(Pengine, module(M)), pengine_property(Pengine, thread(Thread)), catch(thread_property(Thread, status(Status)), _, fail). +stale_module_property(M, module_class, Class) :- + module_property(M, class(Class)). stale_module_property(M, program_space, Space) :- module_property(M, program_space(Space)). stale_module_property(M, program_size, Size) :- module_property(M, program_size(Size)). +stale_module_property(M, predicates, List) :- + current_module(M), + findall(PI, pi_in_module(M, PI), List). stale_module_property(UUID, highlight_state, State) :- current_highlight_state(UUID, State). +pi_in_module(M, Name/Arity) :- + '$c_current_predicate'(_, M:Head), + functor(Head, Name, Arity). 
%% swish_statistics(?State) % diff --git a/lib/swish/trace.pl b/lib/swish/trace.pl index 5fc4a73..d92604e 100644 --- a/lib/swish/trace.pl +++ b/lib/swish/trace.pl @@ -33,7 +33,7 @@ */ :- module(swish_trace, - [ '$swish wrapper'/2 % +Goal, -Residuals + [ '$swish wrapper'/2 % :Goal, ?ContextVars ]). :- use_module(library(debug)). :- use_module(library(settings)). @@ -51,6 +51,7 @@ :- use_module(library(http/html_write)). :- use_module(storage). +:- use_module(config). :- if(current_setting(swish:debug_info)). :- set_setting(swish:debug_info, true). @@ -198,16 +199,32 @@ strip_stack(error(Error, context(prolog_stack(S), Msg)), nonvar(S). strip_stack(Error, Error). -%% '$swish wrapper'(:Goal, -Residuals) +%% '$swish wrapper'(:Goal, ?ContextVars) % % Wrap a SWISH goal in '$swish wrapper'. This has two advantages: % we can detect that the tracer is operating on a SWISH goal by % inspecting the stack and we can save/restore the debug state to % deal with debugging next solutions. +% +% ContextVars is a list of variables that have a reserved name. +% The hooks pre_context/3 and post_context/3 can be used to give +% these variables a value extracted from the environment. This +% allows passing more information than just the query answers. +% +% The binding `_residuals = '$residuals'(Residuals)` is added to +% the residual goals by pengines:event_to_json/4 from +% pengines_io.pl. :- meta_predicate swish_call(0). -'$swish wrapper'(Goal, '$residuals'(Residuals)) :- +'$swish wrapper'(Goal, Extra) :- + ( nb_current('$variable_names', Bindings) + -> true + ; Bindings = [] + ), + debug(projection, 'Pre-context-pre ~p, extra=~p', [Bindings, Extra]), + maplist(call_pre_context(Goal, Bindings), Extra), + debug(projection, 'Pre-context-post ~p, extra=~p', [Bindings, Extra]), catch(swish_call(Goal), E, throw(E)), deterministic(Det), ( tracing, @@ -220,8 +237,7 @@ strip_stack(Error, Error). ) ; notrace ), - Goal = M:_, - residuals(M, Residuals). 
+ maplist(call_post_context(Goal, Bindings), Extra). swish_call(Goal) :- Goal, @@ -232,6 +248,38 @@ no_lco. :- '$hide'(swish_call/1). :- '$hide'(no_lco/0). +%! pre_context(Name, Goal, Var) is semidet. +%! post_context(Name, Goal, Var) is semidet. +% +% Multifile hooks to extract additional information from the +% Pengine, either just before Goal is started or after an answer +% was produced. Extracting the information is triggered by +% introducing a variable with a reserved name. + +:- multifile + pre_context/3, + post_context/3. + +call_pre_context(Goal, Bindings, Var) :- + binding(Bindings, Var, Name), + pre_context(Name, Goal, Var), !. +call_pre_context(_, _, _). + + +call_post_context(Goal, Bindings, Var) :- + binding(Bindings, Var, Name), + post_context(Name, Goal, Var), !. +call_post_context(_, _, _). + +post_context(Name, M:_Goal, '$residuals'(Residuals)) :- + swish_config(residuals_var, Name), + residuals(M, Residuals). + +binding([Name=Var|_], V, Name) :- + Var == V, !. +binding([_|Bindings], V, Name) :- + binding(Bindings, V, Name). + %% residuals(+PengineModule, -Goals:list(callable)) is det. % @@ -240,10 +288,7 @@ no_lco. % goals typically live in global variables that are not visible % when formulating the answer from the projection variables as % done in library(pengines_io). -% -% This relies on the SWI-Prolog 7.3.14 residual goal extension. -:- if(current_predicate(prolog:residual_goals//0)). residuals(TypeIn, Goals) :- phrase(prolog:residual_goals, Goals0), maplist(unqualify_residual(TypeIn), Goals0, Goals). @@ -252,9 +297,6 @@ unqualify_residual(M, M:G, G) :- !. unqualify_residual(T, M:G, G) :- predicate_property(T:G, imported_from(M)), !. unqualify_residual(_, G, G). -:- else. -residuals(_, []). -:- endif. /******************************* @@ -509,10 +551,18 @@ screen_property(width(_)). screen_property(rows(_)). screen_property(cols(_)). +%! swish:tty_size(-Rows, -Cols) is det. +% +% Find the size of the output window. 
This is only registered when +% running _ask_. Notably during compilation it is not known. We +% provided dummy values to avoid failing. + swish:tty_size(Rows, Cols) :- pengine_self(Pengine), + current_predicate(Pengine:screen_property/1), !, Pengine:screen_property(rows(Rows)), Pengine:screen_property(cols(Cols)). +swish:tty_size(24, 80). %! set_file_breakpoints(+Pengine, +File, +Text, +Dict) % @@ -540,8 +590,8 @@ set_file_breakpoints(_Pengine, PFile, Text, Dict) :- set_pengine_breakpoint(Owner, File, Text, Line) :- debug(trace(break), 'Try break at ~q:~d', [File, Line]), line_start(Line, Text, Char), - ( set_breakpoint(Owner, File, Line, Char, Break) - -> !, debug(trace(break), 'Created breakpoint ~p', [Break]) + ( set_breakpoint(Owner, File, Line, Char, _0Break) + -> !, debug(trace(break), 'Created breakpoint ~p', [_0Break]) ; print_message(warning, breakpoint(failed(File, Line, 0))) ). @@ -634,6 +684,9 @@ prolog_clause:open_source(File, Stream) :- user:prolog_exception_hook/4, installed/1. +:- volatile + installed/1. + exception_hook(Ex, Ex, _Frame, Catcher) :- Catcher \== none, Catcher \== 'C', @@ -660,7 +713,7 @@ install_exception_hook :- exception_hook(Ex, Out, Frame, Catcher)), Ref), assert(installed(Ref)). -:- install_exception_hook. +:- initialization install_exception_hook. /******************************* @@ -668,7 +721,8 @@ install_exception_hook :- *******************************/ :- multifile - sandbox:safe_primitive/1. + sandbox:safe_primitive/1, + sandbox:safe_meta_predicate/1. sandbox:safe_primitive(system:trace). sandbox:safe_primitive(system:notrace). @@ -678,6 +732,9 @@ sandbox:safe_primitive(system:deterministic(_)). sandbox:safe_primitive(swish_trace:residuals(_,_)). sandbox:safe_primitive(swish:tty_size(_Rows, _Cols)). +sandbox:safe_meta_predicate(swish_trace:'$swish wrapper'/2). 
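`binding/3` above is deliberately identity-based: it walks the `Name=Var` bindings comparing with `==/2`, because a reserved context variable must be recognised as *that* variable, not as anything merely unifiable with it. The same lookup in Python, using `is` instead of value equality (names are illustrative):

```python
def binding(bindings, var):
    """Find the source name of `var` in (name, variable) pairs by
    object identity, mirroring ==/2 in binding/3."""
    for name, v in bindings:
        if v is var:        # identity, not equality
            return name
    return None
```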
+ + /******************************* * MESSAGES * *******************************/ diff --git a/web/bower_components/codemirror/mode/css/css.js b/web/bower_components/codemirror/mode/css/css.js index 7f8c46e..f5f3a41 100644 --- a/web/bower_components/codemirror/mode/css/css.js +++ b/web/bower_components/codemirror/mode/css/css.js @@ -77,9 +77,9 @@ CodeMirror.defineMode("css", function(config, parserConfig) { return ret("qualifier", "qualifier"); } else if (/[:;{}\[\]\(\)]/.test(ch)) { return ret(null, ch); - } else if ((ch == "u" && stream.match(/rl(-prefix)?\(/)) || - (ch == "d" && stream.match("omain(")) || - (ch == "r" && stream.match("egexp("))) { + } else if (((ch == "u" || ch == "U") && stream.match(/rl(-prefix)?\(/i)) || + ((ch == "d" || ch == "D") && stream.match("omain(", true, true)) || + ((ch == "r" || ch == "R") && stream.match("egexp(", true, true))) { stream.backUp(1); state.tokenize = tokenParenthesized; return ret("property", "word"); @@ -162,16 +162,16 @@ CodeMirror.defineMode("css", function(config, parserConfig) { return pushContext(state, stream, "block"); } else if (type == "}" && state.context.prev) { return popContext(state); - } else if (supportsAtComponent && /@component/.test(type)) { + } else if (supportsAtComponent && /@component/i.test(type)) { return pushContext(state, stream, "atComponentBlock"); - } else if (/^@(-moz-)?document$/.test(type)) { + } else if (/^@(-moz-)?document$/i.test(type)) { return pushContext(state, stream, "documentTypes"); - } else if (/^@(media|supports|(-moz-)?document|import)$/.test(type)) { + } else if (/^@(media|supports|(-moz-)?document|import)$/i.test(type)) { return pushContext(state, stream, "atBlock"); - } else if (/^@(font-face|counter-style)/.test(type)) { + } else if (/^@(font-face|counter-style)/i.test(type)) { state.stateArg = type; return "restricted_atBlock_before"; - } else if (/^@(-(moz|ms|o|webkit)-)?keyframes$/.test(type)) { + } else if (/^@(-(moz|ms|o|webkit)-)?keyframes$/i.test(type)) { 
return "keyframes"; } else if (type && type.charAt(0) == "@") { return pushContext(state, stream, "at"); @@ -383,7 +383,8 @@ CodeMirror.defineMode("css", function(config, parserConfig) { style = style[0]; } override = style; - state.state = states[state.state](type, stream, state); + if (type != "comment") + state.state = states[state.state](type, stream, state); return override; }, @@ -401,7 +402,6 @@ CodeMirror.defineMode("css", function(config, parserConfig) { ch == "{" && (cx.type == "at" || cx.type == "atBlock")) { // Dedent relative to current context. indent = Math.max(0, cx.indent - indentUnit); - cx = cx.prev; } } return indent; @@ -410,6 +410,7 @@ CodeMirror.defineMode("css", function(config, parserConfig) { electricChars: "}", blockCommentStart: "/*", blockCommentEnd: "*/", + blockCommentContinue: " * ", lineComment: lineComment, fold: "brace" }; @@ -472,7 +473,7 @@ CodeMirror.defineMode("css", function(config, parserConfig) { "border-top-left-radius", "border-top-right-radius", "border-top-style", "border-top-width", "border-width", "bottom", "box-decoration-break", "box-shadow", "box-sizing", "break-after", "break-before", "break-inside", - "caption-side", "clear", "clip", "color", "color-profile", "column-count", + "caption-side", "caret-color", "clear", "clip", "color", "color-profile", "column-count", "column-fill", "column-gap", "column-rule", "column-rule-color", "column-rule-style", "column-rule-width", "column-span", "column-width", "columns", "content", "counter-increment", "counter-reset", "crop", "cue", @@ -493,7 +494,7 @@ CodeMirror.defineMode("css", function(config, parserConfig) { "grid-row-start", "grid-template", "grid-template-areas", "grid-template-columns", "grid-template-rows", "hanging-punctuation", "height", "hyphens", "icon", "image-orientation", "image-rendering", "image-resolution", - "inline-box-align", "justify-content", "left", "letter-spacing", + "inline-box-align", "justify-content", "justify-items", "justify-self", "left", 
"letter-spacing", "line-break", "line-height", "line-stacking", "line-stacking-ruby", "line-stacking-shift", "line-stacking-strategy", "list-style", "list-style-image", "list-style-position", "list-style-type", "margin", @@ -508,7 +509,7 @@ CodeMirror.defineMode("css", function(config, parserConfig) { "padding", "padding-bottom", "padding-left", "padding-right", "padding-top", "page", "page-break-after", "page-break-before", "page-break-inside", "page-policy", "pause", "pause-after", "pause-before", "perspective", - "perspective-origin", "pitch", "pitch-range", "play-during", "position", + "perspective-origin", "pitch", "pitch-range", "place-content", "place-items", "place-self", "play-during", "position", "presentation-level", "punctuation-trim", "quotes", "region-break-after", "region-break-before", "region-break-inside", "region-fragment", "rendering-intent", "resize", "rest", "rest-after", "rest-before", "richness", @@ -659,13 +660,13 @@ CodeMirror.defineMode("css", function(config, parserConfig) { "s-resize", "sans-serif", "saturation", "scale", "scale3d", "scaleX", "scaleY", "scaleZ", "screen", "scroll", "scrollbar", "scroll-position", "se-resize", "searchfield", "searchfield-cancel-button", "searchfield-decoration", - "searchfield-results-button", "searchfield-results-decoration", + "searchfield-results-button", "searchfield-results-decoration", "self-start", "self-end", "semi-condensed", "semi-expanded", "separate", "serif", "show", "sidama", "simp-chinese-formal", "simp-chinese-informal", "single", "skew", "skewX", "skewY", "skip-white-space", "slide", "slider-horizontal", "slider-vertical", "sliderthumb-horizontal", "sliderthumb-vertical", "slow", "small", "small-caps", "small-caption", "smaller", "soft-light", "solid", "somali", - "source-atop", "source-in", "source-out", "source-over", "space", "space-around", "space-between", "spell-out", "square", + "source-atop", "source-in", "source-out", "source-over", "space", "space-around", "space-between", 
"space-evenly", "spell-out", "square", "square-button", "start", "static", "status-bar", "stretch", "stroke", "sub", "subpixel-antialiased", "super", "sw-resize", "symbolic", "symbols", "system-ui", "table", "table-caption", "table-cell", "table-column", "table-column-group", @@ -792,7 +793,7 @@ CodeMirror.defineMode("css", function(config, parserConfig) { }, "@": function(stream) { if (stream.eat("{")) return [null, "interpolation"]; - if (stream.match(/^(charset|document|font-face|import|(-(moz|ms|o|webkit)-)?keyframes|media|namespace|page|supports)\b/, false)) return false; + if (stream.match(/^(charset|document|font-face|import|(-(moz|ms|o|webkit)-)?keyframes|media|namespace|page|supports)\b/i, false)) return false; stream.eatWhile(/[\w\\\-]/); if (stream.match(/^\s*:/, false)) return ["variable-2", "variable-definition"]; diff --git a/web/bower_components/codemirror/mode/htmlmixed/htmlmixed.js b/web/bower_components/codemirror/mode/htmlmixed/htmlmixed.js index 16b4f13..33398ec 100644 --- a/web/bower_components/codemirror/mode/htmlmixed/htmlmixed.js +++ b/web/bower_components/codemirror/mode/htmlmixed/htmlmixed.js @@ -133,11 +133,11 @@ return state.token(stream, state); }, - indent: function (state, textAfter) { + indent: function (state, textAfter, line) { if (!state.localMode || /^\s*<\//.test(textAfter)) return htmlMode.indent(state.htmlState, textAfter); else if (state.localMode.indent) - return state.localMode.indent(state.localState, textAfter); + return state.localMode.indent(state.localState, textAfter, line); else return CodeMirror.Pass; }, diff --git a/web/bower_components/codemirror/mode/javascript/javascript.js b/web/bower_components/codemirror/mode/javascript/javascript.js index 7c09476..c4a709c 100644 --- a/web/bower_components/codemirror/mode/javascript/javascript.js +++ b/web/bower_components/codemirror/mode/javascript/javascript.js @@ -11,11 +11,6 @@ })(function(CodeMirror) { "use strict"; -function expressionAllowed(stream, state, backUp) { - 
return /^(?:operator|sof|keyword c|case|new|export|default|[\[{}\(,;:]|=>)$/.test(state.lastType) || - (state.lastType == "quasi" && /\{\s*$/.test(stream.string.slice(0, stream.pos - (backUp || 0)))) -} - CodeMirror.defineMode("javascript", function(config, parserConfig) { var indentUnit = config.indentUnit; var statementIndent = parserConfig.statementIndent; @@ -28,53 +23,21 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { var keywords = function(){ function kw(type) {return {type: type, style: "keyword"};} - var A = kw("keyword a"), B = kw("keyword b"), C = kw("keyword c"); + var A = kw("keyword a"), B = kw("keyword b"), C = kw("keyword c"), D = kw("keyword d"); var operator = kw("operator"), atom = {type: "atom", style: "atom"}; - var jsKeywords = { + return { "if": kw("if"), "while": A, "with": A, "else": B, "do": B, "try": B, "finally": B, - "return": C, "break": C, "continue": C, "new": kw("new"), "delete": C, "throw": C, "debugger": C, - "var": kw("var"), "const": kw("var"), "let": kw("var"), + "return": D, "break": D, "continue": D, "new": kw("new"), "delete": C, "void": C, "throw": C, + "debugger": kw("debugger"), "var": kw("var"), "const": kw("var"), "let": kw("var"), "function": kw("function"), "catch": kw("catch"), "for": kw("for"), "switch": kw("switch"), "case": kw("case"), "default": kw("default"), "in": operator, "typeof": operator, "instanceof": operator, "true": atom, "false": atom, "null": atom, "undefined": atom, "NaN": atom, "Infinity": atom, "this": kw("this"), "class": kw("class"), "super": kw("atom"), "yield": C, "export": kw("export"), "import": kw("import"), "extends": C, - "await": C, "async": kw("async") + "await": C }; - - // Extend the 'normal' keywords with the TypeScript language extensions - if (isTS) { - var type = {type: "variable", style: "variable-3"}; - var tsKeywords = { - // object-like things - "interface": kw("class"), - "implements": C, - "namespace": C, - "module": kw("module"), - "enum": 
kw("module"), - "type": kw("type"), - - // scope modifiers - "public": kw("modifier"), - "private": kw("modifier"), - "protected": kw("modifier"), - "abstract": kw("modifier"), - - // operators - "as": operator, - - // types - "string": type, "number": type, "boolean": type, "any": type - }; - - for (var attr in tsKeywords) { - jsKeywords[attr] = tsKeywords[attr]; - } - } - - return jsKeywords; }(); var isOperatorChar = /[+\-*&%=<>!?|~^@]/; @@ -136,7 +99,7 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { stream.match(/^\b(([gimyu])(?![gimyu]*\2))+\b/); return ret("regexp", "string-2"); } else { - stream.eatWhile(isOperatorChar); + stream.eat("="); return ret("operator", "operator", stream.current()); } } else if (ch == "`") { @@ -146,14 +109,27 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { stream.skipToEnd(); return ret("error", "error"); } else if (isOperatorChar.test(ch)) { - if (ch != ">" || !state.lexical || state.lexical.type != ">") - stream.eatWhile(isOperatorChar); + if (ch != ">" || !state.lexical || state.lexical.type != ">") { + if (stream.eat("=")) { + if (ch == "!" || ch == "=") stream.eat("=") + } else if (/[<>*+\-]/.test(ch)) { + stream.eat(ch) + if (ch == ">") stream.eat(ch) + } + } return ret("operator", "operator", stream.current()); } else if (wordRE.test(ch)) { stream.eatWhile(wordRE); - var word = stream.current(), known = keywords.propertyIsEnumerable(word) && keywords[word]; - return (known && state.lastType != ".") ? 
ret(known.type, known.style, word) : - ret("variable", "variable", word); + var word = stream.current() + if (state.lastType != ".") { + if (keywords.propertyIsEnumerable(word)) { + var kw = keywords[word] + return ret(kw.type, kw.style, word) + } + if (word == "async" && stream.match(/^(\s|\/\*.*?\*\/)*[\[\(\w]/, false)) + return ret("async", "keyword", word) + } + return ret("variable", "variable", word) } } @@ -307,6 +283,10 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { } } + function isModifier(name) { + return name == "public" || name == "private" || name == "protected" || name == "abstract" || name == "readonly" + } + // Combinators var defaultVars = {name: "this", next: {name: "arguments"}}; @@ -352,6 +332,8 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { if (type == "var") return cont(pushlex("vardef", value.length), vardef, expect(";"), poplex); if (type == "keyword a") return cont(pushlex("form"), parenExpr, statement, poplex); if (type == "keyword b") return cont(pushlex("form"), statement, poplex); + if (type == "keyword d") return cx.stream.match(/^\s*$/, false) ? 
cont() : cont(pushlex("stat"), maybeexpression, expect(";"), poplex); + if (type == "debugger") return cont(expect(";")); if (type == "{") return cont(pushlex("}"), block, poplex); if (type == ";") return cont(); if (type == "if") { @@ -361,60 +343,73 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { } if (type == "function") return cont(functiondef); if (type == "for") return cont(pushlex("form"), forspec, statement, poplex); - if (type == "variable") return cont(pushlex("stat"), maybelabel); - if (type == "switch") return cont(pushlex("form"), parenExpr, pushlex("}", "switch"), expect("{"), + if (type == "class" || (isTS && value == "interface")) { cx.marked = "keyword"; return cont(pushlex("form"), className, poplex); } + if (type == "variable") { + if (isTS && value == "declare") { + cx.marked = "keyword" + return cont(statement) + } else if (isTS && (value == "module" || value == "enum" || value == "type") && cx.stream.match(/^\s*\w/, false)) { + cx.marked = "keyword" + if (value == "enum") return cont(enumdef); + else if (value == "type") return cont(typeexpr, expect("operator"), typeexpr, expect(";")); + else return cont(pushlex("form"), pattern, expect("{"), pushlex("}"), block, poplex, poplex) + } else if (isTS && value == "namespace") { + cx.marked = "keyword" + return cont(pushlex("form"), expression, block, poplex) + } else if (isTS && value == "abstract") { + cx.marked = "keyword" + return cont(statement) + } else { + return cont(pushlex("stat"), maybelabel); + } + } + if (type == "switch") return cont(pushlex("form"), parenExpr, expect("{"), pushlex("}", "switch"), block, poplex, poplex); if (type == "case") return cont(expression, expect(":")); if (type == "default") return cont(expect(":")); if (type == "catch") return cont(pushlex("form"), pushcontext, expect("("), funarg, expect(")"), statement, poplex, popcontext); - if (type == "class") return cont(pushlex("form"), className, poplex); if (type == "export") return 
cont(pushlex("stat"), afterExport, poplex); if (type == "import") return cont(pushlex("stat"), afterImport, poplex); - if (type == "module") return cont(pushlex("form"), pattern, pushlex("}"), expect("{"), block, poplex, poplex) - if (type == "type") return cont(typeexpr, expect("operator"), typeexpr, expect(";")); if (type == "async") return cont(statement) if (value == "@") return cont(expression, statement) return pass(pushlex("stat"), expression, expect(";"), poplex); } - function expression(type) { - return expressionInner(type, false); + function expression(type, value) { + return expressionInner(type, value, false); } - function expressionNoComma(type) { - return expressionInner(type, true); + function expressionNoComma(type, value) { + return expressionInner(type, value, true); } function parenExpr(type) { if (type != "(") return pass() return cont(pushlex(")"), expression, expect(")"), poplex) } - function expressionInner(type, noComma) { + function expressionInner(type, value, noComma) { if (cx.state.fatArrowAt == cx.stream.start) { var body = noComma ? arrowBodyNoComma : arrowBody; - if (type == "(") return cont(pushcontext, pushlex(")"), commasep(pattern, ")"), poplex, expect("=>"), body, popcontext); + if (type == "(") return cont(pushcontext, pushlex(")"), commasep(funarg, ")"), poplex, expect("=>"), body, popcontext); else if (type == "variable") return pass(pushcontext, pattern, expect("=>"), body, popcontext); } var maybeop = noComma ? maybeoperatorNoComma : maybeoperatorComma; if (atomicTypes.hasOwnProperty(type)) return cont(maybeop); if (type == "function") return cont(functiondef, maybeop); - if (type == "class") return cont(pushlex("form"), classExpression, poplex); - if (type == "keyword c" || type == "async") return cont(noComma ? 
maybeexpressionNoComma : maybeexpression); + if (type == "class" || (isTS && value == "interface")) { cx.marked = "keyword"; return cont(pushlex("form"), classExpression, poplex); } + if (type == "keyword c" || type == "async") return cont(noComma ? expressionNoComma : expression); if (type == "(") return cont(pushlex(")"), maybeexpression, expect(")"), poplex, maybeop); if (type == "operator" || type == "spread") return cont(noComma ? expressionNoComma : expression); if (type == "[") return cont(pushlex("]"), arrayLiteral, poplex, maybeop); if (type == "{") return contCommasep(objprop, "}", null, maybeop); if (type == "quasi") return pass(quasi, maybeop); if (type == "new") return cont(maybeTarget(noComma)); + if (type == "import") return cont(expression); return cont(); } function maybeexpression(type) { if (type.match(/[;\}\)\],]/)) return pass(); return pass(expression); } - function maybeexpressionNoComma(type) { - if (type.match(/[;\}\)\],]/)) return pass(); - return pass(expressionNoComma); - } function maybeoperatorComma(type, value) { if (type == ",") return cont(expression); @@ -425,7 +420,9 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { var expr = noComma == false ? expression : expressionNoComma; if (type == "=>") return cont(pushcontext, noComma ? 
arrowBodyNoComma : arrowBody, popcontext); if (type == "operator") { - if (/\+\+|--/.test(value)) return cont(me); + if (/\+\+|--/.test(value) || isTS && value == "!") return cont(me); + if (isTS && value == "<" && cx.stream.match(/^([^>]|<.*?>)*>\s*\(/, false)) + return cont(pushlex(">"), commasep(typeexpr, ">"), poplex, me); if (value == "?") return cont(expression, expect(":"), expr); return cont(expr); } @@ -434,6 +431,12 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { if (type == "(") return contCommasep(expressionNoComma, ")", "call", me); if (type == ".") return cont(property, me); if (type == "[") return cont(pushlex("]"), maybeexpression, expect("]"), poplex, me); + if (isTS && value == "as") { cx.marked = "keyword"; return cont(typeexpr, me) } + if (type == "regexp") { + cx.state.lastType = cx.marked = "operator" + cx.stream.backUp(cx.stream.pos - cx.stream.start - 1) + return cont(expr) + } } function quasi(type, value) { if (type != "quasi") return pass(); @@ -458,6 +461,7 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { function maybeTarget(noComma) { return function(type) { if (type == ".") return cont(noComma ? targetNoComma : target); + else if (type == "variable" && isTS) return cont(maybeTypeArgs, noComma ? maybeoperatorNoComma : maybeoperatorComma) else return pass(noComma ? expressionNoComma : expression); }; } @@ -481,18 +485,25 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { } else if (type == "variable" || cx.style == "keyword") { cx.marked = "property"; if (value == "get" || value == "set") return cont(getterSetter); + var m // Work around fat-arrow-detection complication for detecting typescript typed arrow params + if (isTS && cx.state.fatArrowAt == cx.stream.start && (m = cx.stream.match(/^\s*:\s*/, false))) + cx.state.fatArrowAt = cx.stream.pos + m[0].length return cont(afterprop); } else if (type == "number" || type == "string") { cx.marked = jsonldMode ? 
"property" : (cx.style + " property"); return cont(afterprop); } else if (type == "jsonld-keyword") { return cont(afterprop); - } else if (type == "modifier") { + } else if (isTS && isModifier(value)) { + cx.marked = "keyword" return cont(objprop) } else if (type == "[") { - return cont(expression, expect("]"), afterprop); + return cont(expression, maybetype, expect("]"), afterprop); } else if (type == "spread") { - return cont(expression); + return cont(expressionNoComma, afterprop); + } else if (value == "*") { + cx.marked = "keyword"; + return cont(objprop); } else if (type == ":") { return pass(afterprop) } @@ -539,11 +550,32 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { if (value == "?") return cont(maybetype); } } - function typeexpr(type) { - if (type == "variable") {cx.marked = "variable-3"; return cont(afterType);} + function mayberettype(type) { + if (isTS && type == ":") { + if (cx.stream.match(/^\s*\w+\s+is\b/, false)) return cont(expression, isKW, typeexpr) + else return cont(typeexpr) + } + } + function isKW(_, value) { + if (value == "is") { + cx.marked = "keyword" + return cont() + } + } + function typeexpr(type, value) { + if (value == "keyof" || value == "typeof") { + cx.marked = "keyword" + return cont(value == "keyof" ? 
typeexpr : expressionNoComma) + } + if (type == "variable" || value == "void") { + cx.marked = "type" + return cont(afterType) + } if (type == "string" || type == "number" || type == "atom") return cont(afterType); - if (type == "{") return cont(pushlex("}"), commasep(typeprop, "}", ",;"), poplex) + if (type == "[") return cont(pushlex("]"), commasep(typeexpr, "]", ","), poplex, afterType) + if (type == "{") return cont(pushlex("}"), commasep(typeprop, "}", ",;"), poplex, afterType) if (type == "(") return cont(commasep(typearg, ")"), maybeReturnType) + if (type == "<") return cont(commasep(typeexpr, ">"), typeexpr) } function maybeReturnType(type) { if (type == "=>") return cont(typeexpr) @@ -556,22 +588,36 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { return cont(typeprop) } else if (type == ":") { return cont(typeexpr) + } else if (type == "[") { + return cont(expression, maybetype, expect("]"), typeprop) } } - function typearg(type) { - if (type == "variable") return cont(typearg) - else if (type == ":") return cont(typeexpr) + function typearg(type, value) { + if (type == "variable" && cx.stream.match(/^\s*[?:]/, false) || value == "?") return cont(typearg) + if (type == ":") return cont(typeexpr) + return pass(typeexpr) } function afterType(type, value) { if (value == "<") return cont(pushlex(">"), commasep(typeexpr, ">"), poplex, afterType) - if (value == "|" || type == ".") return cont(typeexpr) + if (value == "|" || type == "." 
|| value == "&") return cont(typeexpr) if (type == "[") return cont(expect("]"), afterType) + if (value == "extends" || value == "implements") { cx.marked = "keyword"; return cont(typeexpr) } + } + function maybeTypeArgs(_, value) { + if (value == "<") return cont(pushlex(">"), commasep(typeexpr, ">"), poplex, afterType) + } + function typeparam() { + return pass(typeexpr, maybeTypeDefault) + } + function maybeTypeDefault(_, value) { + if (value == "=") return cont(typeexpr) } - function vardef() { + function vardef(_, value) { + if (value == "enum") {cx.marked = "keyword"; return cont(enumdef)} return pass(pattern, maybetype, maybeAssign, vardefCont); } function pattern(type, value) { - if (type == "modifier") return cont(pattern) + if (isTS && isModifier(value)) { cx.marked = "keyword"; return cont(pattern) } if (type == "variable") { register(value); return cont(); } if (type == "spread") return cont(pattern); if (type == "[") return contCommasep(pattern, "]"); @@ -596,7 +642,8 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { function maybeelse(type, value) { if (type == "keyword b" && value == "else") return cont(pushlex("form", "else"), statement, poplex); } - function forspec(type) { + function forspec(type, value) { + if (value == "await") return cont(forspec); if (type == "(") return cont(pushlex(")"), forspec1, expect(")"), poplex); } function forspec1(type) { @@ -620,11 +667,13 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { function functiondef(type, value) { if (value == "*") {cx.marked = "keyword"; return cont(functiondef);} if (type == "variable") {register(value); return cont(functiondef);} - if (type == "(") return cont(pushcontext, pushlex(")"), commasep(funarg, ")"), poplex, maybetype, statement, popcontext); - if (isTS && value == "<") return cont(pushlex(">"), commasep(typeexpr, ">"), poplex, functiondef) + if (type == "(") return cont(pushcontext, pushlex(")"), commasep(funarg, ")"), poplex, 
mayberettype, statement, popcontext); + if (isTS && value == "<") return cont(pushlex(">"), commasep(typeparam, ">"), poplex, functiondef) } - function funarg(type) { + function funarg(type, value) { + if (value == "@") cont(expression, funarg) if (type == "spread") return cont(funarg); + if (isTS && isModifier(value)) { cx.marked = "keyword"; return cont(funarg); } return pass(pattern, maybetype, maybeAssign); } function classExpression(type, value) { @@ -636,24 +685,27 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { if (type == "variable") {register(value); return cont(classNameAfter);} } function classNameAfter(type, value) { - if (value == "<") return cont(pushlex(">"), commasep(typeexpr, ">"), poplex, classNameAfter) - if (value == "extends" || value == "implements" || (isTS && type == ",")) + if (value == "<") return cont(pushlex(">"), commasep(typeparam, ">"), poplex, classNameAfter) + if (value == "extends" || value == "implements" || (isTS && type == ",")) { + if (value == "implements") cx.marked = "keyword"; return cont(isTS ? typeexpr : expression, classNameAfter); + } if (type == "{") return cont(pushlex("}"), classBody, poplex); } function classBody(type, value) { + if (type == "async" || + (type == "variable" && + (value == "static" || value == "get" || value == "set" || (isTS && isModifier(value))) && + cx.stream.match(/^\s+[\w$\xa1-\uffff]/, false))) { + cx.marked = "keyword"; + return cont(classBody); + } if (type == "variable" || cx.style == "keyword") { - if ((value == "async" || value == "static" || value == "get" || value == "set" || - (isTS && (value == "public" || value == "private" || value == "protected" || value == "readonly" || value == "abstract"))) && - cx.stream.match(/^\s+[\w$\xa1-\uffff]/, false)) { - cx.marked = "keyword"; - return cont(classBody); - } cx.marked = "property"; return cont(isTS ? classfield : functiondef, classBody); } if (type == "[") - return cont(expression, expect("]"), isTS ? 
classfield : functiondef, classBody) + return cont(expression, maybetype, expect("]"), isTS ? classfield : functiondef, classBody) if (value == "*") { cx.marked = "keyword"; return cont(classBody); @@ -680,6 +732,7 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { } function afterImport(type) { if (type == "string") return cont(); + if (type == "(") return pass(expression); return pass(importSpec, maybeMoreImports, maybeFrom); } function importSpec(type, value) { @@ -701,6 +754,12 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { if (type == "]") return cont(); return pass(commasep(expressionNoComma, "]")); } + function enumdef() { + return pass(pushlex("form"), pattern, expect("{"), pushlex("}"), commasep(enummember, "}"), poplex, poplex) + } + function enummember() { + return pass(pattern, maybeAssign); + } function isContinuedStatement(state, textAfter) { return state.lastType == "operator" || state.lastType == "," || @@ -708,6 +767,12 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { /[,.]/.test(textAfter.charAt(0)); } + function expressionAllowed(stream, state, backUp) { + return state.tokenize == tokenBase && + /^(?:operator|sof|keyword [bcd]|case|new|export|default|spread|[\[{}\(,;:]|=>)$/.test(state.lastType) || + (state.lastType == "quasi" && /\{\s*$/.test(stream.string.slice(0, stream.pos - (backUp || 0)))) + } + // Interface return { @@ -773,6 +838,7 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { electricInput: /^\s*(?:case .*?:|default:|\{|\})$/, blockCommentStart: jsonMode ? null : "/*", blockCommentEnd: jsonMode ? null : "*/", + blockCommentContinue: jsonMode ? null : " * ", lineComment: jsonMode ? 
null : "//", fold: "brace", closeBrackets: "()[]{}''\"\"``", @@ -782,6 +848,7 @@ CodeMirror.defineMode("javascript", function(config, parserConfig) { jsonMode: jsonMode, expressionAllowed: expressionAllowed, + skipExpression: function(state) { var top = state.cc[state.cc.length - 1] if (top == expression || top == expressionNoComma) state.cc.pop() diff --git a/web/help/about.html b/web/help/about.html index 135ed18..3619b9d 100644 --- a/web/help/about.html +++ b/web/help/about.html @@ -56,7 +56,9 @@ style="color:maroon">SH</span> is a great tool for teaching Prolog. We provide a <a href="http://lpn.swi-prolog.org">prototype of Learn Prolog Now!</a> where <span style="color:darkblue">SWI</span><span style="color:maroon">SH</span> is embedded to run examples and solve -excercises from within your browser. +excercises from within your browser. Peter Flach prepared his book +<a href="http://book.simply-logical.space/">Simply Logical</a> for +<span style="color:darkblue">SWI</span><span style="color:maroon">SH</span>. </p> <p> @@ -65,6 +67,7 @@ style="color:maroon">SH</span> source is available from <a href="http://github.com/SWI-Prolog/swish.git">Github</a>. It is under heavy development and often requires SWI-Prolog 7 installed from the latest <a href="http://github.com/SWI-Prolog/swipl-devel.git">GIT</a>. +We also provide a <a href="https://hub.docker.com/r/swipl/swish/">Docker image</a>. </p> <p> diff --git a/web/help/sourcelist.html b/web/help/sourcelist.html new file mode 100644 index 0000000..f1ff791 --- /dev/null +++ b/web/help/sourcelist.html @@ -0,0 +1,61 @@ +<!DOCTYPE HTML> + +<html> + <head> + <title>Finding files</title> + <style> + dl.help-sourcelist dd { margin-left: 2em; } + </style> + </head> +<body> + +<p> +The file overview allows you to find and open files. The <b>Filter</b> +and <b>Type</b> menus can be used to pre-fill the query input field. 
+The input field itself uses the following syntax: + + <dl class="help-sourcelist"> + <dt><code>type:</code><var>pl | swinb | lnk</var> + <dd>Restrict the search to documents of the indicated type. + </dd> + <dt><code>user:</code><var>Pattern</var> + <dd>Restrict the search to documents attributed to a user matching + <var>Pattern</var>. The <var>Pattern</var> <code>me</code> restricts + the search to your documents. If you are logged in, these are documents + created when you were logged in. Otherwise ownership is heuristically + determined based on the <i>avatar</i> and <i>nickname</i>. + </dd> + <dt><code>name:</code><var>Pattern</var> + <dd>Restrict the search to files whose name matches <var>Pattern</var>. + </dd> + <dt><code>tag:</code><var>Pattern</var> + <dd>Restrict the search to files with a tag matching <var>Pattern</var>. + </dd> + <dt>"String" + <dd>Search for files containing <var>String</var>. Match is case + insensitive. + </dd> + <dt>/Regex/[flags] + <dd>Search for files whose content matches <var>Regex</var>. Flags + is a sequence of the following flags: <b>i</b> (ignore case), <b>x</b> + (Perl extended regular expressions), <b>m</b> (multiple lines), + <b>s</b> (Perl <i>dotall</i> mode). + </dd> + <dt><i>Word</i> + <dd>Search for files containing <var>Word</var>. Match is case + insensitive. + </dd> + </dl> + +<p> +The <code>type:</code>, <code>user:</code>, <code>name:</code> and +<code>tag:</code> syntax may be followed by a sequence of non-blanks, a +double-quoted string or a regular expression written as +<b>/Regex/flags</b>. + +<p> +Search is executed two seconds after the last modification of the search +field, when <b>RET</b> is hit, or when the search button is clicked. + +</body> +</html> diff --git a/web/js/require.js b/web/js/require.js index 0fc1082..051e284 100644 --- a/web/js/require.js +++ b/web/js/require.js @@ -1,5 +1,5 @@ /** vim: et:ts=4:sw=4:sts=4 - * @license RequireJS 2.3.3 Copyright jQuery Foundation and other contributors.
+ * @license RequireJS 2.3.5 Copyright jQuery Foundation and other contributors. * Released under MIT license, https://github.com/requirejs/requirejs/blob/master/LICENSE */ //Not using strict: uneven strict support in browsers, #392, and causes @@ -11,7 +11,7 @@ var requirejs, require, define; (function (global, setTimeout) { var req, s, head, baseElement, dataMain, src, interactiveScript, currentlyAddingScript, mainScript, subPath, - version = '2.3.3', + version = '2.3.5', commentRegExp = /\/\*[\s\S]*?\*\/|([^:"'=]|^)\/\/.*$/mg, cjsRequireRegExp = /[^.]\s*require\s*\(\s*["']([^'"\s]+)["']\s*\)/g, jsSuffixRegExp = /\.js$/,
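Several of the CodeMirror CSS-mode hunks in this commit add the `/i` flag to previously case-sensitive at-rule regexes. A minimal standalone sketch of the behavioural difference, using a pattern copied from the css.js hunk (independent of CodeMirror itself):

```javascript
// Old (case-sensitive) and new (case-insensitive) at-rule patterns,
// as they appear in the css.js hunk above.
var before = /^@(media|supports|(-moz-)?document|import)$/;
var after  = /^@(media|supports|(-moz-)?document|import)$/i;

// CSS at-rule names are case-insensitive, so "@MEDIA" is valid CSS,
// but the mode only highlighted the lower-case spelling before this patch.
console.log(before.test("@MEDIA")); // false
console.log(after.test("@MEDIA"));  // true
console.log(after.test("@media"));  // true
```

The same treatment is applied to the `url(`, `domain(` and `regexp(` tokens via `stream.match("omain(", true, true)`, whose third argument enables case-folded comparison in CodeMirror's `StringStream.match`.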