Days 1 and 2 of TechEd Barcelona I wrote up already:
https://blogs.sap.com/2018/10/25/tech-ed-barcelona-not-much-detecting-2-days-out-of-3./ so what follows is from the last day (Thursday).
Above are 2 tweets from the first session of the day (no pictures, for once) concerning data privacy. While I had viewed the slides ahead of time, hearing the content with full commentary by someone expert in the field made my early arrival on the last day worthwhile. Taking a journey through past practices, with references to supporting literature, Daniel Bernau moved on to the current state of the art in protecting data from misuse/abuse ("differential privacy"). Examples included health records as well as financial ones.
One particular data obfuscation technique concerned a true/false questionnaire that included, as I understood the method, a coin-flip style randomization of part of the answers. At first this made no sense to me, but then I gathered that not every answer would be randomized, so a known amount of 'jitter' would be added.
Daniel claimed that there are mathematical proofs demonstrating that the original data cannot be reconstituted from the mangled data. As someone in the audience asked: yes, as long as the un-mangled data itself cannot be reached.
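Here's a minimal sketch of how I understand the coin-flip idea (the classic "randomized response" technique). This is my own illustration, not Daniel's actual method, and the 50/50 probabilities are assumptions; the point is that any single answer has plausible deniability, yet the overall rate can still be estimated:

```python
import random

def randomized_response(true_answer: bool, p_keep: float = 0.5) -> bool:
    """Flip a coin: on heads report the real answer, on tails report a second,
    independent coin flip instead."""
    if random.random() < p_keep:
        return true_answer          # report the real answer
    return random.random() < 0.5    # report a random answer

def estimate_true_rate(responses, p_keep: float = 0.5) -> float:
    """Recover the population-level 'yes' rate from the noisy responses.
    Observed rate = p_keep * true_rate + (1 - p_keep) * 0.5, so solve for true_rate."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_keep) * 0.5) / p_keep

# Example: 10,000 respondents, 30% of whom would truthfully answer "yes"
truth = [random.random() < 0.3 for _ in range(10_000)]
noisy = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(noisy), 3))  # close to 0.30, but no single answer is trustworthy
```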
Random link:
http://datadrivensecurity.info/ (sadly, their https: connection is not valid).
Stretch link: In Barcelona, we visited an art museum [MACBA] exhibit of a series of photographs from New York City real estate that was not allowed to be shown in the 1970s at the Guggenheim.
https://www.macba.cat/en/shapolsky-et-al-manhattan-real-estate-holdings-a-real-time-social-system-as...
In the days of electric typewriters, carbon paper, and 4-drawer file cabinets, intrepid researchers could suss out complex financial holdings that appeared to be deliberately constructed to hide the truth of where the money goes. Today, with digitization and deep search engines, hiding the links would seem to be harder. Adding anonymization layers may make data more private. One question I'd have is whether protecting individual identifying data is the only goal, or whether exposing abuse (whether personal, financial, or criminal) is also worthy.
-- "The most dangerous man to any government is the man who is able to think things out for himself, without regard to the prevailing superstitions and taboos." [cit.
https://en.wikiquote.org/wiki/H._L._Mencken ]
Beyond the bewildering array of data processing/manipulating tools, I pulled out a question related to the above section on data anonymization: a data "masking" component. On clarification, this is a cruder obfuscation function that acts a bit like a graffiti spray can, overwriting some or all of specific fields rather than applying sophisticated mathematical models that could preserve statistical distributions where those might be needed.
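The masking component itself wasn't shown in code, so here's a rough sketch of what I mean by spray-can overwriting; the field names and the keep-the-last-N rule are made up for illustration, not taken from the SAP tool:

```python
def mask_value(value: str, keep_last: int = 0, fill: str = "*") -> str:
    """Replace all but the last `keep_last` characters with a fill character."""
    if keep_last <= 0:
        return fill * len(value)
    return fill * max(len(value) - keep_last, 0) + value[-keep_last:]

record = {"name": "Jane Example", "iban": "ES9121000418450200051332", "city": "Barcelona"}
rules = {"name": 0, "iban": 4}   # hypothetical rules: blank out the name, keep the IBAN's last 4 chars

masked = {field: (mask_value(value, rules[field]) if field in rules else value)
          for field, value in record.items()}
print(masked)   # name is all asterisks, IBAN ends in "...1332", city left untouched
```

Useful for screenshots and test systems, but as noted above, any aggregate statistics on a masked field are gone for good.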
Obligatory (semi-)required link to a related press release: https://news.sap.com/2016/09/sap-welcomes-altiscale-provider-of-high-performance-big-data-as-a-servi... A bit dated, perhaps, in the fast-paced acquisition space; now renamed something like "SAP Cloud Platform Big Data Services". You can guess the rest.
This is my brain. After an espresso, and with a few minutes to be productive between sessions. My take at first was a cross between "Meet The Parents" and the latest U.S. Senate "Advise And Consent" stupid show, though this was intended to be a product in the pipeline to improve user experience by recording happiness (as denoted by particular brainwave patterns) in the workplace. Um. No thanks.
Perusing the Mendix presentation by michal.keidar; more on this topic later.
Swung by the Microsoft booth for a quick chat (and wave) with jrgen.thomas2, who has presented on SAP topics for years, primarily SQL Server but of course now about that other database. And Azure cloud. We had a good conversation about the culture shock of long-time Windows administrators coming into the new SAP world where Linux underlies the hardware and database.
A couple of shots from a meet-up session on ongoing development of testing platforms for data integration (more commonly known as PI/PO to many). As it was a whiteboard session with no supporting documents, I don't have anything to share, other than that there's no volume/stress testing included, there are some nice-to-have bells and whistles (such as OData), and perhaps, maybe, an improved UI with a digit.
The primary ask from experts in the audience (I counted 3 other SAP Mentors) was that, somehow, project management be automatically informed that data integration testing needs to be in the plan from the beginning, and not just a "yes, we might need to test."
Thu 12:30 - 13:30, CodeJam (Mini-Edition) 1, Show floor:
"... Check out Analyze Your Code for Vulnerabilities with Vulnerability Analyzer Tools at SAPTechEd Barcelona 2018." SEC607, in SAP CodeJam mini-edition "room" 1.
I had not experienced the mini-CodeJam/hands-on sessions before, though I have been to a CodeJam and to a few full-scale hands-on sessions. Instead of jumping on the keyboard, though, I sat on an observer stool.
The second picture from the session, above, is sadly a bit fuzzy. I wanted to capture the speaker, the screen, the audience, and their workstations, using only an auto-focus digital camera (one craig.cmehil described as "ancient" ;-) ). Here's a better crop of the SQL injection technique to be scanned for:
The vulnerability shown is someone altering the underlying SQL by injecting text values. Maybe contrived, but then again, not outside the realm of possibility. Writing the logic to find these kinds of holes is challenging, as I understand from participating in an ABAP code logic review we commissioned years before the companion code review package was available from SAP (customers built their own because SAP did not have what we needed).
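I won't try to reproduce the exact pattern from the screen, but a generic illustration of the kind of hole such scanners hunt for (here in Python with SQLite rather than ABAP/Open SQL) looks something like this: building the statement by pasting in user text versus binding it as a parameter.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name: str):
    # The pattern a scanner flags: user-supplied text spliced straight into the SQL.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name: str):
    # Bind parameters keep the input as data, never as SQL text.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # returns every row: the injected text rewrote the query
print(find_user_safe(payload))    # returns nothing: no user is literally named that
```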
I can always count on thomas.jung to draw a crowd; no exception this week. His mini hands-on had people in the aisles, back and sides.
Before I went to my hands-on sit-down, 2-hour tech dinner, I recorded an interview for the SAP TechEd Live crew. jason.lax and I had corresponded about the topic and the techniques. I guess it all turned out fine.
I had signed up for DAT265, SAP Enterprise Architecture Designer, as an area I had some background familiarity with. In the past, many hands-on sessions would not have been beneficial given the corporate use of specific tools/versions, and the chance to see something brand new appealed more to the sense of challenge than to applicability.
In the few years since my last hands-on sit-down, knock-down, drag-out, a few things have changed. First, no paper copies anywhere, either of the background instructions or the exercises. Good from my side (less waste), and it worked well since 2 monitors were available (laptop and standalone). With minimal instructions as to where things were, how to log in, and how not to step on your peers' data sets, we were off to the races.
One personal glitch I encountered was a lack of familiarity with what I'd call a "German keyboard," where the Z and Y keys are switched from their usual places: not QWERTY but QWERTZ (the first several letters on the top row of alphas). That was only a minor annoyance, since typing in specific file names wasn't an exercise requirement (I hope); it should not make any difference whether I called the saved data column or table or whatever entity "CO" or "COMPANY", for example. The bigger fail, I think, was not finding the underscore key, and a few other non-alpha/non-numeric entries.
Other parts of the online environment worked well, such as the web browser on one screen and Adobe Acrobat on another, so I could generally copy/paste the necessary data content and avoid keystroke errors, particularly for highly specific values like port numbers (e.g. "51044"). A diversion into the WinRAR application was needed, apparently, for unpacking an archive into place. That seemed a little clunky, but there are more ZIP/UNZIP tools than you can shake a cursor at, including Windows built-ins.
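For what it's worth, if the archive was a plain ZIP, a few lines of scripting would have done the same unpacking without any extra install; the file and folder names below are hypothetical placeholders, not the exercise's actual paths.

```python
# Minimal alternative to a WinRAR detour for a plain ZIP archive.
import zipfile

with zipfile.ZipFile("exercise_content.zip") as archive:
    archive.extractall("C:/Temp/exercise_content")  # create/extract into this folder
```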
How did I make out? Well, judging from the scroll bar on the side of the instructions, about 2/3 of the way through from start to finish. I needed help once with the icons, having picked a relationship instead of an attribute, or something. So the tool was relatively easy to pick up with minimal experience (a value judgment there!), and the error checking/ongoing feedback was helpful enough to get me that far. I think if I didn't have 3 other places I wanted to attend simultaneously, and had enough patience to wait for one of the two instructors to stop over, I could have made it nearer to the end.
As it turns out, I failed the build. Good thing this wasn't an actual test.
The best I could gather from the console logs mentioned was syntax errors, based on the feedback "unexpected token" (one being "<" and the other being ";"). My take is that I dropped these into file or folder names accidentally due to the keyboard differences. If that is correct, maybe I should have been given an earlier warning ("Windows [or Linux] won't let you use that character in a file name").
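As an aside, the kind of early check I wished for is a few lines of code; note that a semicolon is actually legal in a Windows file name, so the designer's own parser was probably the stricter one here. The function name and rules below are mine, not the tool's.

```python
# Sketch of an early filename check against Windows' reserved characters.
WINDOWS_FORBIDDEN = set('<>:"/\\|?*')

def check_name(name: str) -> list:
    """Return the offending characters; an empty list means the name is acceptable to Windows."""
    return sorted(set(name) & WINDOWS_FORBIDDEN)

print(check_name("COMPANY_MODEL"))   # [] -- fine
print(check_name("COMPANY<MODEL;"))  # ['<'] -- ';' is legal on Windows, '<' is not
```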
The classic teacher/student::parent/child dialogue that would have ensued next would be something like "and what have we learned from this, Johnny?" And I'd say, probably nothing from the user interface / data manipulation zone. Many software tools I've used have some hidden or unexpected rules that you either learn to avoid after one or two flubs, or you keep hitting them and wonder why the machine/software just won't adapt.
Bigger picture, I'm not certain where the architect tool sits in an enterprise software world. The instructor threw out a comparison to the SAP Web IDE, so my expectation would be that the architecture designer is more symbolic and less code-heavy (maybe I should have shouldered my way through the crowd around Thomas J). A reference to git code management indicated that the architect designer isn't a total standalone package.
There were comments about conceptual data modelling versus physical data modelling, but in reference to urban land use ("James Martin/1960s"; still checking references to this, maybe http://www.chesapeake.org/conowingo_model/downloads/JLMartin_3_2016%20Vitae.pdf is applicable). Probably that was a theory-versus-practice view, as the split between the two in computer data modelling is more claimed than real, in my experience.
One aspect mentioned in the overview was using the tool to reverse engineer a legacy application. I don't think this is a full make-something-out-of-nothing proposition; it's more like an incremental "we've lost (or never had) the source code to this specific application, or it's written in a language we don't want to pay anyone to decipher," then "we know enough about how it works in the old system, please build a parallel version in the new system," and then we can create a better version based on the discovery process. More interpolation than extrapolation.
Cutting out of the hands-on class room, I was privileged to learn things about how SAP uses their own tools internally. I've blocked the screen due to non-disclosure needs, and can at least show a few Mentors having give-and-take with SAP teams. At one point, I learned a new perspective on some of the data anonymization topics I reviewed in the first Thursday session, using hotel guest information as the jumping-off point. "Loyalty" programs offer discounts to frequent customers; how would companies store and use personal data in a responsible way?
After the sessions, demos, hands-on, and one-on-ones, the SAP Mentors had their own wrap-up, to go over the good-bad-and-ugly of the shortened week. One primary goal is feedback to SAP product managers; service in the community is also in our charter. Expect to see more of the SAP Inside Tracks, in some new places. If you want to make that happen in your community, let us know.
Stammtisch or Round Table? It depends.
Next to last shot of the day - leaving the TechEd show floor with only site staff security around. It's almost as quiet as it was early Tuesday.
And the pack-and-go teamsters ready to roll up the carpets and split...
Final on-site image - our wonderful hosts, who kept a fresh supply of drinks and snacks available (yes, oliver, it was instant coffee; and yes, it's better than what my hotel supplied, which was none).
2 after-event dinner party shots - could not pick which one I like better...
And me on SAP TechEd Live TV, with Michal.
Confession: I had a list of questions to ask, which went out the window as soon as the camera started rolling, and I winged several topic areas that occurred to me on the fly. My notebook came in only partly handy, so it was more of a prop than a script. Michal and I did not rehearse, or even chat much beforehand, so take this in the spontaneous fashion it was created in.
Last word on the SAP Usability session: my co-volunteer and I could not agree who was to be the participant and who was the observer. We ended the impasse with a one-round contest of rock-paper-scissors. Seems like that translates universally.
The end; see you next time!