Saturday, April 06, 2024

Python 2.7 -> 3.8 on App Engine. Ordered checklist with an unfolding stub.

The context here is a kind of "resource pump" webapp: an early static web site that, in the mid-web-2.0 era (say 2008), was commonly migrated to Python 2 with WSGI on Google App Engine. The simple Python program does very little; the app.yaml file defines the work, serving folders full of static files.

So, how do we move these sites to App Engine with Python 3.8 and Flask 2? Here's an ordered checklist, that is, a set of instructions with ordered dependencies. Ordering is also typical of unfolding sequences in software, but in those there tends to be more creativity and judgment involved: the steps are not instructions so much as helpful issues to consider at that moment. Most of the steps below simply need to happen, in that order, so they are instructions. There's only one somewhat creative step here, yet it highlights the point where more of them might be written and inserted, to serve a wider range of migrations.


  1. git commit -a -m 'starting migration from python 2.7'

  2. mkdir templates

  3. mv index.html templates

  4. git add templates/index.html

  5. add requirements.txt and git add requirements.txt

  6. add main.py (if there's a route table in the WSGI version, move the routes to flask. This is the one creative task. It's kind of a stub: this is where all the creative tasks in a sequence would go, to serve a greater range of programs.) and git add main.py

  7. change the app.yaml head

  8. change the app.yaml tail and git commit -m 'first python 3.8 changes'

  9. (if it makes sense, set up the local test environment)

  10. (if it makes sense, run gunicorn & test) Note that gunicorn does not use app.yaml, so your mileage may vary with this local test environment. If you don't need to debug the server-side Python, skip ahead to deployment test steps 12-14.

  11. (if you created a virtual environment, add the <project env> directory to .gcloudignore)

  12. gcloud app deploy --project <migrating site> --no-promote

  13. Go to (cloud console->app engine -> versions), find the new version, launch and test

  14. Is the test good? Select the new version and click “migrate traffic”.
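
If you prefer the command line for step 14, the same promotion can be done with gcloud. This is only a sketch, not part of the original checklist; the service name "default" and the version name are placeholders to replace:

gcloud app services set-traffic default --splits <new version>=1 --project <migrating site>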

If you want to set up a local test environment (again, useful if there's more server code to test):

virtualenv -p python3.8.2 <project env>

source ./<project env>/bin/activate

pip install -r requirements.txt

(or

pip install gunicorn

pip install flask

pip install google-cloud-datastore

pip list

)

gunicorn -b :8080 main:app

(test in browser at localhost:8080)

^c

deactivate


old app.yaml head:

runtime: python27

api_version: 1

threadsafe: false


new app.yaml head:

runtime: python38

app_engine_apis: true


old app.yaml tail:

- url: /.*

  script: <migrating site>.app

  secure: always

  redirect_http_response_code: 301


new app.yaml tail:

- url: /.*

  script: auto

  secure: always

  redirect_http_response_code: 301
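
Putting the new head and tail together: for a site like this, which mostly serves folders of static files, the full migrated app.yaml might look roughly like the sketch below. The static folder names are placeholders, not from the original app:

runtime: python38
app_engine_apis: true

handlers:
- url: /images
  static_dir: images
  secure: always

- url: /css
  static_dir: css
  secure: always

- url: /.*
  script: auto
  secure: always
  redirect_http_response_code: 301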


new requirements.txt:

Flask==2.2.2

google-cloud-datastore==2.7.0

appengine-python-standard>=1.0.0

google-auth==2.17.1

google-auth-oauthlib==1.0.0

google-auth-httplib2==0.1.0

werkzeug==2.2.2

And here's the one creative step in this checklist's sequence: migrating the route table. It's only a stub for further creative-and-judged unfolding steps, if one is migrating server-side application logic:

new main.py:

from flask import Flask, render_template, request

app = Flask(__name__)


@app.route('/')

@app.route('/endpoint_one')

@app.route('/endpoint_two')

def root():

    # NB: index.html must be in /templates

    return render_template('index.html')

if __name__ == '__main__':

    app.run()
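
If the old route table maps several URLs to different templates, rather than everything to index.html, the Flask translation follows the same pattern. Here is a minimal sketch, with hypothetical page names that are not part of the original app:

from flask import Flask, abort, render_template

app = Flask(__name__)

# hypothetical mapping from URL path to template file in /templates
PAGES = {
    'about': 'about.html',
    'contact': 'contact.html',
}

@app.route('/<name>')
def page(name):
    # serve only the templates we know about
    if name not in PAGES:
        abort(404)
    return render_template(PAGES[name])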


old <migrating site>.py:

# universal index.html delivery

# in python27 as a service

# on Google App Engine

import cgi

import os

import webapp2

from google.appengine.ext.webapp import template

from google.appengine.api import users

class MainPage(webapp2.RequestHandler):

    def get(self):

        template_values = {

            }

        path = os.path.join(os.path.dirname(__file__), 'index.html')

        self.response.out.write(template.render(path, template_values))

app = webapp2.WSGIApplication(

                                     [('/', MainPage)

                                     ,('/endpoint_one',MainPage)

                                     ,('/endpoint_two',MainPage)

                                      ],

                                     debug=True)


Friday, March 15, 2024

Broken or erratic or unreliable visual editor (emacs, vim, etc.) over ssh in Mac terminal?

This is a rather specific problem. 

But I couldn't find mention of it anywhere.

The MacOS terminal implementation has difficulty when the window, or more precisely the amount of data stored in the terminal process, gets very large ... 

... say you've been using it for days, and lots of output has scrolled up, but you haven't opened a new terminal window ... maybe because you want to look at what you've done already. You could export it, but then you'd have to think about where to put that exported data. 

Mac terminal tends to be a bit greedy of RAM, and if you have lots of these terminal windows, managing lots of projects, you may see some performance degradation.

... but, also, you may see some actual problems.

For example, if one of these terminals is connected to a remote host over ssh, and you start to use a visual editor (say, emacs, vi, or vim) on the remote machine, the editor might start to make errors and become essentially unusable. The remote editor has expectations about terminal control sequences, and the Mac terminal, unable to work fast enough or buffer them reliably because of the large volume of text in the current process, simply fails to stay in sync.

So, you're not crazy. Not everything in computing is deterministic, especially when networking is involved. (And yes, Apple could and should fix this problem.)

For now, simply close the connection to the remote host, export the terminal text if you want to keep it, close the terminal window, and open a new one.


Wednesday, November 22, 2023

Fixing software updates by playing with hardware

The "Epson Perfection V600 Photo" is a fine scanner, but the software often fails it, and the user.

This is true after all the most recent updates have been applied, regardless of which scanning app is used.

The error is:

"Unable to send data. Check the connection to the scanner and try again. E583-B318"

It reports this error even while visibly communicating with the scanner, with various whirring and clicking as evidence.

Not so helpfully, the error prevents scanning from taking place, after which the scanning app terminates.

With a bit of oddball tinkering, I found a "fix". I'll use the transparency scanner as an example.

If you open the lid of the scanner, and try to scan, you get this error:

"Remove the document mat from the scanner."

But this is not a fatal error. Close the scanner lid, click "ok", and try your scan.

For me (and hopefully others) scanning then works continuously, until I unplug the scanner from the computer, or quit the scanning app.

I'll let the reader draw their own conclusions about the level of investment Epson makes to assure software quality ...

Friday, October 20, 2023

gmail broken with arbitrary, invisible, forced short autowrap

 ... or that's how it looked to me. 

Gmail compose took a long input line, which would normally soft-wrap to fit whatever box it was viewed in, but instead it created a hard, short wrap, in the background, where I could not see it and did not want it.

Obviously I'd accidentally changed a setting. But I scoured the settings and couldn't see an option that fit the problem.

There's another set of settings, weirdly not referenced or linked in the main settings. These are in the compose window. And they don't apply until the next time you open the compose window.

Those settings are under the three dots to the right of the text tools, and the culprit was "plain text mode".

Now as someone who used email decades before there was email formatting, I was a little irked by the assertion implied by this setting's name.

If it's 'plain text', why not just treat the input the way it will be received? Why create an arbitrarily short hard wrap of the input text, which will always look wrong precisely because it's not previewed, that is, not WYSIWYG? It turns a potentially useful option into one that would only be useful for sending email to very primitive small-screen devices, with no way to use plain text under the sender's control.

So, a broken UX in Gmail, which is usually more careful about its features.



Sunday, October 01, 2023

Werkzeug ... and who is responsible for code stability?

Don't you love it when you haven't deployed for a few days, and you change something insignificant, and then your deployed app crashes, because of something far outside your purview?

The Python Werkzeug WSGI library was just updated to 3.0. Flask 2.2.2 only pins a lower bound on Werkzeug, so Flask web apps deployed to Google Cloud's App Engine without their own pin automatically picked up the new version. And if you use one of its utilities, url_quote (removed in 3.0), you get this error:

ImportError: cannot import name 'url_quote' from 'werkzeug.urls'

So, yes, I might have caught this by updating and running it first in my local environment. But Google Cloud could have caught this too, creating a stable environment for incremental deployment.

The fix is to add this line to your requirements.txt file:

werkzeug==2.2.2
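
If you keep a local virtual environment around, you can also check which version pip actually resolves before deploying (a quick sanity check, assuming pip is on your path):

pip install -r requirements.txt
pip show werkzeug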

This reminds me of the whole unnecessary forced move to Flask, with its mixed bag of improvements and problems. It should be possible to run a webapp of any age (at least any configuration since 2008) on Google App Engine. Why all the unnecessary updating, crashing, and subsequent compulsory code obsolescence? What happened to backwards compatibility? If it were still an observed principle, it would be easier now than ever. Why the insistence on forcing programmers to chase after the latest thing?


Saturday, June 17, 2023

Error messages, Google Maps, PinElement, AdvancedMarkerElement, MapId

I often try using a new feature by adding the most local code possible, that is, right where I need it. It fails, of course. The UX for developers is very rarely a priority. When it fails, I then fix one problem at a time, step-by-step, slowly getting the feature to do its job, by working my way outwards from that first bit of code, and seeing what kinds of errors I get. I want to see whether the errors point me to the next-most general problem or, instead, obscure what's actually going on. As everyone knows, it's usually the latter.
I have a hope that someday error messages will become genuinely useful, and not just strings that we need to search for in Google, in the hope that someone like me has explained the error message, and how to make the necessary repairs.
In this case, quite a lot of time could have been saved if the Google Maps API (which is loaded) had simply caught the exceptions and told me "do (1), then (2), then (3) ..." 
Ah well. Developer advocates clearly have no power inside such giant technocracies.
The goal: I wanted a custom marker. The 'free' way to do this is by adding an 'icon' name-value pair in a marker:
const bluepin = "/bluepin.png";
...
marker = new google.maps.Marker({position,map,icon: bluepin});

But there's a new set of advanced marker features. The documentation didn't match my use case (my markers are created in a listener), so I started with:

const pinBackgroundBlue = new PinElement(
    {background: "blue"});
    ...
marker = new AdvancedMarkerElement(
    {map,position: position,
     content: pinBackgroundBlue.element});


... and in the console, Javascript told me that PinElement was not found. Fair enough.

In the head element I added an import of the appropriate Google Maps libraries:

<script async defer
  src="https://maps.googleapis.com/maps/api/js?key=YOUR_GOOGLE_MAPS_KEY&v=beta&libraries=visualization,marker&callback=initMap">
</script>

Now, in the function initMap itself, I didn't change the map declaration. 
The map is global, however.
In the listener, I added the recommended import mappings:
// Request needed libraries.
const { Map } = await google.maps.importLibrary("maps");
const { AdvancedMarkerElement, PinElement } = await google.maps.importLibrary(
"marker"
);

But since this was a listener, javascript gave me an error. The enclosing function wasn't async, so it could not await.
So I did this, which worked:
const { Map } = google.maps.importLibrary("maps");
const { AdvancedMarkerElement, PinElement } = google.maps.importLibrary("marker");
Now it didn't complain about await  / async. 
But still: "PinElement not found".
Maybe the namespace is still isolated? I really don't know how much "importLibrary" is supposed to do. So I tried:
const pinBackgroundBlue = new window.google.maps.marker.PinElement(
{background: "blue"});


Now it found PinElement!

But it said "PinElement is not a constructor".

Well ... that's just silly. Of course it is.

So, I read the documentation, watched the Google Maps videos ...

And the only thing that could be missing was something that was new to me:

A "MapId".

This is for elite maps, I suppose, since Google requires a generated MapId, a privilege tied to your billing account. I don't know what the charge is now, or later.

But if you just want to try it, add the following to the name-value pairs in your map declaration:

mapId: 'DEMO_MAP_ID'

And now PinElement is a constructor! And your custom pin will appear on the map.

Sunday, April 23, 2023

Deleting a Google Cloud Platform domain mapping under Google App Engine

Google's current documentation on this topic doesn't match the actual user interface, at least for Google App Engine custom domain mappings.

Say you want to free up a domain that you've used on a different project on GCP or GAE.

When you try to use the domain, you get an error like:

error: [domain] is already mapped to a project

To solve this, use the command line (assuming you have the developer tools and so use gcloud regularly to deploy your apps) and do the following:

gcloud auth login

This will open a browser window for you to sign in with your Google account.

Set the project in the gcloud tool to the old project where the domain is currently mapped:

gcloud config set project PROJECT_ID

Replace PROJECT_ID with the actual Project ID of the old project.

List the domain mappings for the old project:

gcloud app domain-mappings list

Locate the domains you want to remove from the list, and then run the following command to delete the domain mapping:

gcloud app domain-mappings delete DOMAIN_NAME

... replacing DOMAIN_NAME with the actual domain you want to remove.

After a bit of waiting, you'll be able to use the domain again in a custom mapping.

Thursday, April 20, 2023

Simplest Google App Engine Python 3.8 Flask web app with Google Identity

Tested as of this post, with Google's latest approach to sign-in ... as far as I know. With the latest python 3.8 and flask resources (as vanilla as possible ... as far as I know). 
The proper sequence (skip anything you've done already): 
1. sign up for Google App Engine
2. put down a credit card
3. download developer tools
4. create an app
5. enable all the APIs and Services you plan to use 
   (e.g., Firestore's Datastore interface, in the case below).
6. go to API Credentials on Google Cloud Console
7. add an OAuth 2.0 Client ID
8. authorize your app for javascript origins (you'll use it eventually) but most importantly for this app, authorize the redirect URIs for: 
https://app project name.appspot.com
https://app project name.appspot.com/oauth2callback
(and any domains you map to that application)
9. Everything in red on this post needs to be replaced with your own values.

This app loads index.html when the user is not signed in, offering a sign-in link; once the user has signed in with Google, it renders home.html with the user info we have access to.

app.yaml


runtime: python38
app_engine_apis: true

env_variables:
  CLIENT_ID: 'your client ID copied from your google cloud console'
  CLIENT_SECRET: 'your client secret copied from your google cloud console'
  SECRET_KEY: "the super secret key you make up"

instance_class: F1

requirements.txt

Flask==2.2.2
google-cloud-datastore==2.7.0
appengine-python-standard>=1.0.0
google-auth==2.17.1
google-auth-oauthlib==1.0.0
google-auth-httplib2==0.1.0
werkzeug==2.2.2

main.py

import os
import requests
from flask import Flask, render_template, request, redirect, url_for, session
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import Flow
from google.cloud import datastore

app = Flask(__name__)

SECRET_KEY = os.environ.get("SECRET_KEY", "the super secret key you make up")
app.secret_key = SECRET_KEY

CLIENT_ID = os.environ.get('CLIENT_ID')
CLIENT_SECRET = os.environ.get('CLIENT_SECRET')
REDIRECT_URI = "https://app project name.appspot.com/oauth2callback"

@app.route('/')
def index():
    if 'userinfo' in session:
        return render_template("home.html", userinfo=session['userinfo'])
    return render_template("index.html")

@app.route('/login')
def login():
    flow = Flow.from_client_config(
        {
            "web": {
                "client_id": CLIENT_ID,
                "client_secret": CLIENT_SECRET,
                "redirect_uris": [REDIRECT_URI],
                "auth_uri": "https://accounts.google.com/o/oauth2/auth",
                "token_uri": "https://accounts.google.com/o/oauth2/token",
            }
        },
        scopes=[
            "https://www.googleapis.com/auth/userinfo.email",
            "https://www.googleapis.com/auth/userinfo.profile",
            "openid",
        ],
    )
    flow.redirect_uri = REDIRECT_URI
    authorization_url, _ = flow.authorization_url(prompt="consent")
    return redirect(authorization_url)

@app.route('/sign_out')
def sign_out():
    session.pop('userinfo', None)
    return redirect(url_for('index'))

@app.route('/oauth2callback')
def oauth2callback():
    flow = Flow.from_client_config(
        {
            "web": {
                "client_id": CLIENT_ID,
                "client_secret": CLIENT_SECRET,
                "redirect_uris": [REDIRECT_URI],
                "auth_uri": "https://accounts.google.com/o/oauth2/auth",
                "token_uri": "https://accounts.google.com/o/oauth2/token",
            }
        },
        scopes=[
            "https://www.googleapis.com/auth/userinfo.email",
            "https://www.googleapis.com/auth/userinfo.profile",
            "openid",
        ],
        state=request.args.get("state"),
    )
    flow.redirect_uri = REDIRECT_URI
    flow.fetch_token(code=request.args.get("code"))
    credentials = flow.credentials
    userinfo = get_user_info(credentials)
    session['userinfo'] = userinfo
    return redirect(url_for('index'))

def get_user_info(credentials):
    headers = {
        "Authorization": f"Bearer {credentials.token}"
    }
    response = requests.get("https://www.googleapis.com/oauth2/v2/userinfo", headers=headers)
    userinfo = response.json()
    return userinfo

if __name__ == "__main__":
    app.run(debug=True)


templates/home.html

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>your home page title</title>
</head>
<body>
<h1>Welcome, {{ userinfo['name'] }}</h1>
<h2>Email: {{ userinfo['email'] }}</h2>
<img src="{{ userinfo['picture'] }}" alt="Profile picture" width="100" height="100">
<a href="{{ url_for('sign_out') }}">Sign Out</a>
</body>
</html>

templates/index.html

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>application title</title>
<script src="https://apis.google.com/js/platform.js" async defer></script>
<meta name="google-signin-client_id" content="{{ CLIENT_ID }}">
</head>
<body>
<h1>Welcome to this web app</h1>
<a href="{{ url_for('login') }}">Sign in with Google</a>
</body>
</html>
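
With the templates and main.py in place, deploy as usual (the project name is a placeholder):

gcloud app deploy --project <app project name>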

Thursday, January 26, 2023

Datastore preservation: migrating Python 2.7 to Python 3 on Google App Engine

Let's say you deploy a production web application on Google App Engine, using a Python 2.7 runtime. That's getting quite old today, and there are increasing numbers of incompatibilities you need to cope with as a result. What to do? 

It might seem daunting, and risky, to migrate to the rather different development environment of Google App Engine with the runtime of Python 3.x.

But if done properly, one worry, or risk, can be avoided: the data in Datastore. Even if you have a great deal of it, it will be preserved during this migration, without the complications of an ETL process. If you translate your environment and webapp properly, the data doesn't go anywhere.

See for yourself.

Here are two webapp sequences, for the same simple three-tier application. 

The first is a Python 2.7 application. 

The second is a Python 3.x application. 

It's worth creating this project yourself, so you can feel confident in the migration.

After this, when working on the migration of your production application, it's important to take precautions -- such as duplication of the codebase, and copying the datastore instance -- to allow rollback.

But you probably won't need it.

--------------

Sequence 1

A simple three-tier application on Google App Engine with the webapp2 framework and the python 2.7 runtime. We assume you've created your project and enabled billing, in your Google Cloud Console.

create index.html

<html>

<head>

<title><project id></title>

</head>

<body>

<div style="color:white;font-weight:bold;size:20px;">

<project id><br/><br/>

Visits {{visits}}

</div>

</body>

</html>


create <project id>.py

import cgi

import os

import webapp2

from google.appengine.ext.webapp import template

from google.appengine.api import users

from google.appengine.ext import db


class Visit(db.Model):

   visitor   = db.StringProperty()

   timestamp = db.DateTimeProperty(auto_now_add=True)


def store_visit(remote_addr, user_agent):

        Visit(visitor='{}: {}'.format(remote_addr, user_agent)).put()


def fetch_visits():

        return str(Visit.all().count())

    

class MainPage(webapp2.RequestHandler):

    def get(self):

        store_visit(self.request.remote_addr, self.request.user_agent)

        visits = fetch_visits()

        template_values = {'visits': visits}

        path = os.path.join(os.path.dirname(__file__), 'index.html')

        self.response.out.write(template.render(path, template_values))



app = webapp2.WSGIApplication(

          [('/', MainPage)],

            debug=True)


create app.yaml

runtime: python27

api_version: 1

threadsafe: false

handlers:

- url: /.*

  script: <project id>.app

  secure: always

  redirect_http_response_code: 301


run "python --version"

Python 2.7.4


test your app locally with:

dev_appserver.py --log_level debug .


deploy with:

gcloud app deploy --project <project id>

  

Visit the live page a few times, and find your datastore entities from Google Cloud Console.

---------------------------------------------

---------------------------------------------

  

Sequence 2

A simple three-tier application on Google App Engine with the flask framework and the python 3.8 runtime.

On your machine:

virtualenv -p python3.8.2 <project env>

source ./<project env>/bin/activate

pip install gunicorn

pip install flask

pip install google-cloud-datastore

        pip list

From the pip list results, include these two versions in a file called "requirements.txt":

Flask==2.2.2

google-cloud-datastore==2.13.2

  

[if you do this repeatedly, in different local virtual environments, you end up running "pip install -r requirements.txt" often]

create "app.yaml"

runtime: python38

  

create "main.py"

from datetime import datetime, timezone

import json

import time

import google.auth

from flask import Flask, render_template, request

from google.cloud import datastore


app = Flask(__name__)

ds_client = datastore.Client()


def store_visit(remote_addr, user_agent):

    entity = datastore.Entity(key=ds_client.key('Visit'))

    entity.update({

        'timestamp': datetime.now(timezone.utc),

        'visitor': '{}: {}'.format(remote_addr, user_agent),

    })

    ds_client.put(entity)



def fetch_visits():

    'get total visits'

    query = ds_client.query(kind="Visit")

    return len(list(query.fetch()))


@app.route('/')

def root():

    store_visit(request.remote_addr, request.user_agent)

    visits = fetch_visits()


    # NB: index.html must be in /templates

    return render_template('index.html',visits=visits)


if __name__ == '__main__':

    app.run()

 

create "templates/index.html":

<!doctype html>

<html>

<head>

<title><project id></title>

</head>

<body>

<h1><project id></h1>

<p>{{ visits }} visits</p>


</body>

</html>


set project with:

gcloud config set project <project id>

(Note this needs to be run whenever you switch to working on a different project locally, so that your local development environment connects to that project's datastore.)


set credentials 

gcloud auth application-default login

(This will launch a browser -- log in, and 'allow.' Note this will connect your local environment with the remote datastore. If you want a local datastore, you need to use the datastore emulator).
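
(If you do want a local datastore, the usual emulator pattern is roughly the following -- a sketch, assuming the gcloud beta components are installed. Run the emulator in one terminal, and the env-init line in the terminal where you run gunicorn, so the client library points at the emulator:

gcloud beta emulators datastore start
$(gcloud beta emulators datastore env-init)

)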


run locally with:

gunicorn -b :8080 main:app

      

deploy with:

gcloud app deploy --project <project id>

      

deactivate

Check your database on google cloud console. It's still there!


Saturday, September 18, 2021

Python and Javascript shared file on Google App Engine / Google Cloud Platform

The principle known as 'single source of truth' (SSoT) is quite important in computing, and is perhaps underappreciated. We need to write programs whose assumptions are consistent and easily discoverable. It's important to avoid the introduction of multiple sources for the same 'things': things that can be as simple as constants, and as complex as the definition of terminology within a theory of automation.

Sometimes, we need this to take place within a single shared file, accessible during runtime between the server and the client code, to express important editable values once.

I use local files in my development environment, to make changes to my webapp. In one important application, I don't need a makefile-like compiling stage. So, I test locally with a development server and a browser, as everyone does. But I want to have configuration files -- central, essential, important files -- which pretty much define everything critical in my application, both the universal and the contingent aspects ... and all the intermediate levels of detail needed to properly define those aspects of the work.

I call this an 'essence' or 'central description'. It is neither javascript (well, it's JSON here for technical convenience) nor python. It's just a configuration file. I've found that this compels me to maintain a quite useful habit: keep things modularized and parameterized at a higher level than code, making important ideas reusable as the application unfolds.

But how do you connect a Google App Engine Python server application (on Google Cloud Platform) to the javascript that it serves up? It seems like one should be able to use a shared JSON file easily. But that's not the case.

However, given that it's extremely important, here are the technical details of connecting them.

The shared file is shared.json.
The server-side python is called my_app.py.
The client-side javascript is my_app.js.
There's an app.yaml.
They are all in the same folder.
And I'm intentionally ignoring the html necessary to get to the javascript code, which could be embedded script, or a separate file, static, or generated ... that's all up to you. I don't know where you run your javascript, so I've just called it 'my_app.js'.
shared.json

{
 // a comment
 "name":"my_app",
 "things": {
 "one":"a"
 "two":"b"
 }
}


my_app.py

import webapp2
import io
import json
import re

# this version of strip_comments 
# only works if the comment is defined
# by //, and // is the first non-whitespace
# on the line

def strip_comments(some_json):
 some_json = str(some_json)
 return re.sub(r'(?m)^[ \t]*//.*\n?', '', some_json)
    
class ReadAndRespond(webapp2.RequestHandler):
 def get(self):

 # get the file
 file = io.open("shared.json", "rb", buffering = 5)
 json_string = file.read()
 file.close()
 
 # strip the comments
 json_sans_comments = strip_comments(json_string)
 
 # load the JSON into a python dictionary
 dict_obj = json.loads(json_sans_comments)
 
 # get one of the values 
 my_things = dict_obj.get("things")
 
 # use it for a new JSON payload for the response
 # to the javascript request
 return_string = json.dumps(my_things, sort_keys=True, indent=4)

 self.response.write(return_string)

app = webapp2.WSGIApplication([('/the_json', ReadAndRespond)])
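
To sanity-check the comment-stripping and parsing step on its own, outside App Engine, here is a small standalone sketch (assuming any local Python interpreter):

import json
import re

def strip_comments(some_json):
    # drop lines whose first non-whitespace characters are //
    return re.sub(r'(?m)^[ \t]*//.*\n?', '', str(some_json))

raw = '''{
 // a comment
 "name": "my_app",
 "things": {"one": "a", "two": "b"}
}'''

print(json.loads(strip_comments(raw)))
# -> {'name': 'my_app', 'things': {'one': 'a', 'two': 'b'}}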


my_app.js

 ...

 var fetch = eval( ajaxReturn('the_json') );
 var things = fetch;

 ...
 
 function ajaxReturn(xhr_url) {
    var return_string = '';
    $.ajax({
                dataType: "text",
                url : xhr_url,
                async: false,
                success : function (newContent) {
                   return_string = newContent;
                },
                error : function ()
                {
                }
    });
    return return_string;
 }
 
 ...

app.yaml

runtime: python27
api_version: 1
threadsafe: false

handlers:
- url: /shared.json
  static_files: shared.json
  upload: shared.json
  application_readable: true
  secure: always
  redirect_http_response_code: 301

- url: /.*
  script: my_app.app
  secure: always
  redirect_http_response_code: 301

Tuesday, March 23, 2021

Google Cloud Platform or Google App Engine 'gcloud app deploy' not updating your app? Debugger not working?

If you use python, and occasionally run your app locally, you'll notice some files that are generated: .pyc and index.yaml. If you accidentally 'gcloud app deploy' these files, especially when you have not run it locally for a while, they can break your application! Or drive you crazy!

In particular, they break the Google Cloud Console debugger (formerly the stackdriver debugger)  ... so while you're trying to figure out why your application doesn't work as expected, the debugger keeps telling you that you're setting logpoints on non-existent lines of code!

What has happened is that Google is not checking whether the .py source file and the .pyc file match. It would be very nice if they did this. They could just check the timestamps. It would be easy. But they don't.

And you keep changing your source code in trivial ways, perhaps adding some logging to see what's going on, but the live deployed application doesn't change. And the debugger doesn't work!

The fix is obvious: delete the .pyc and index.yaml files. Or put them in your .gcloudignore file.
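
For example, these two .gcloudignore lines keep the generated files out of every future deploy (a minimal sketch; adjust to your project):

*.pyc
index.yaml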

I hope this helps someone. :-)




CHAQ: Central Handler, Action Queue

The evolution of cheq into CHAQ is pretty easy to explain. 'Events' are things that happen to you, and 'actions' are things that you do. The Central Handler in a javascript application is called by system events that we initiate in our own code, and then we let go, so the browser can do what it needs to. But before we let go, we check what actions we need to take. They are on a queue of these actions, which we've loaded within our own code. We process them, then pass control to the browser.

You can see this code, used to explain itself in a baby web app, at chaq.rocks.  

One other change: in our 'option-oriented programming', where the program itself is determined by a JSON structure, that structure now refers to the names of other JSON objects, not to functions. I found that I would otherwise not make the functions generally usable by other JSON instructions, so functions are only called by 'process'. Ultimately, for namespace sanity, our functions are injected into process, where they are called by the program described in JSON. I call that program the 'essence', and it is really a declarative, not-javascript-specific 'structure', which can be used for any level of abstraction you like.

The point of the essence is to give people an opportunity to maintain clarity of ideas in the description of the program's activity. Coding beyond the barrier of incomprehensibility (and past other desired qualities) is very common in software development. Anything that helps us prevent that needs to be explored further.

 

Monday, August 01, 2016

cheq: central_handler and event_queue

No matter how you look at it, even if it's hidden, every good browser-based JavaScript application must have a "central handler", and an "event queue". I call this necessity "cheq", as a mnemonic.

Why is this? Because a JavaScript program cannot hold onto (monopolize or block) the execution thread (the control flow of the browser's computational actions) and still make use of the essential services provided by the browser: rendering, user event handling, etc. We must pass control back to the browser, all the time, or nothing apparently happens.

But how do you do this, if you need your program to do "many things that are tied together", while passing control to the browser between each of these things? The answer, as I've said, is your own "event queue": a control channel under your control, which will persist while the browser is busy, say, rendering something for you. Every JavaScript programmer runs into this problem all the time: why isn't "X" appearing on the screen? Oh -- I didn't pass control back to the browser. This is especially obvious when you build animations.

If you have an event queue of your own, independent of the browser's event system, then you need a central_handler that manages that event_queue. Hence "cheq":


/* -----------------
 cheq.rocks 
 "cheq" means "central_handler event_queue".

 This needs to be at the heart of any browser-based 
 javascript application. It allows you to control your 
 program flow while cooperatively passing control to the 
 browser in order to render, handle events ... 

 The initial call from index.html looks like this:
     event_queue.push({'name':'initial_subhandler_name',
      'delay':2000});
     central_handler()

 Subsequent calls from inside the event look like:
     event_queue.push({'name':'some_subhandler_name',
      'delay':2000});
     central_handler()

 OUR EVENT QUEUE (so the browser regularly gets control):
  uses .push(x) to add to queue
  and .shift() to get the next event
*/
var event_queue = [];
var the_event = null;

// CENTRAL_HANDLER:
//  called by onload and setTimeout
function central_handler() {

    if (!(event_queue.length > 0)) {
        return;
    }
    the_event = event_queue.shift();

    // call event
    window[the_event.name]();

    // only loop until the stack is empty
    if (event_queue.length > 0) {
        setTimeout(function () {central_handler();},
                   the_event.delay);
    }
}

// end of cheq.rocks
// -----------------


"Cheq" may be considered "the heart" of any JavaScript application, from one perspective. It's not necessarily the most useful idea for a program "heart", for comprehensibility. But ... maybe it is a useful central organizing principle. You never know until you try. So, I'm going to try. I'll evaluate this with my "smoothly unfolding sequence" approach, described at core memory, making good use of the Life Perception Faculty in the human brain as a means of judgment, and see if I can maintain "explanatory" reasonableness as well. My explorations in maintaining a good development structure, from this starting point, will be here: cheq.rocks