James Reads


Day of Aug 20th, 2019

  • Manhattan DA Cy Vance Made Google Give Up Info on Everyone in Area in Hunt for Antifa After Proud Boys Fight

Investigators ended up targeting the wrong people, at least in part from that information. (Published 08.13.19; updated 08.15.19.) When Gavin McInnes—founder of the violent, far-right group The Proud Boys—spoke to a Manhattan Republican club last October, the neighborhood response was less than welcoming. Protesters took to the normally sedate Upper East Side block with chants and spray paint. The Proud Boys responded with fists and kicks. Nearly a year later, as the assault and riot charges against four Proud Boys go to trial, prosecutors revealed that they had turned to an alarming new surveillance tool in this case: a reverse search warrant. The Manhattan District Attorney's Office admitted it demanded Google hand over account information for all devices used in parts of the Upper East Side. They didn’t do this to find the Proud Boys; they did it to find Antifa members. Reverse search warrants have been used in other parts of the country, but this is the first time one was disclosed in New York. Unlike a traditional warrant, where law enforcement officials request information on a specific phone or individual, reverse warrants allow law enforcement to target an entire neighborhood. Police and prosecutors create a “geofence”—a map area—and demand information on anyone standing in the zone. This flips the logic of search warrants on its head. Rather than telling service providers the name or phone number of a suspect, reverse search warrants start with the location and work backwards. It’s a big change. Depending on the size and location of the geofence, a reverse search warrant can easily target hundreds or even thousands of bystanders. That scale is what makes reverse search warrants so enticing to law enforcement and so concerning to civil liberties groups. 
One concern is that the more broadly law enforcement uses surveillance, the higher the risk for “false discovery.” That’s a clinical way to say that the more people you spy on, the more innocent people will wrongly go to jail. The phenomenon is well-documented in the sciences, where researchers have long known that “high false discovery rates occur when many outcomes of a single intervention are tested.” Essentially, when you look for too many patterns at the same time, you increase the danger that the data will fool you. When police officers request the data for hundreds or even thousands of devices, there’s a higher chance that they’ll wrongly think that one of those bystanders is a suspect. This isn’t just theoretical. That’s what Jorge Molina discovered in 2018, when Arizona detectives wrongly arrested him for a brutal murder, jailing him for nearly a week before he was exonerated. Officers demanded that Google hand over information on every single laptop, phone, tablet, and smart device in a two-block area. We don’t know how many accounts that includes, but it’s no surprise that, while sifting through that many devices, they quickly found a “match.” Only he was innocent. In response to the Manhattan DA’s reverse search warrant, Google provided information that investigators used—along with images given to a private facial recognition company—to target two people who turned out to be innocent bystanders. Thankfully, unlike in Molina’s case, the two “matches” in Manhattan were never arrested—and the Antifa members have not been identified, even as several Proud Boys have stood trial. But with the seal broken now in Manhattan, there are likely to be more geofence warrants and more false discoveries. While a judge needs to sign off on a reverse warrant, that formality provides little protection to the public. A traditional warrant application asks for information about the individual being targeted and the reasons they are suspected. 
With reverse warrants, judges don’t even know how many people’s data will be compromised. They simply don’t have enough information to do their job. It’s also unclear how judges will evaluate reverse warrants around sensitive sites: political protests, houses of worship or medical facilities, among others. The practice is even more alarming when you consider the ways that ICE and other federal agencies could use a reverse warrant to pursue their deportation campaigns and target American immigrants. None of this is to say that reverse search warrants are unique; they are just the latest example of how the surveillance capitalism that powers tech firms can become a tool for the government. Maybe some users who happily hand their data to the tech giants will second-guess that choice when they realize how quickly their digital sidekicks can morph into a big brother. Albert Fox Cahn is the executive director of The Surveillance Technology Oversight Project at the Urban Justice Center, a New York-based civil rights and privacy organization. On Twitter @cahnlawny.  Source: Manhattan DA Cy Vance Made Google Give Up Info on Everyone in Area in Hunt for Antifa After Proud Boys Fight
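The "false discovery" arithmetic the article describes can be made concrete with a back-of-the-envelope sketch. The specificity figure and device counts below are illustrative assumptions, not numbers from the article:

```javascript
// Expected number of innocent devices wrongly flagged when a matching
// procedure with a given false-positive rate is run against every device
// swept up by a warrant. Purely illustrative numbers.
function expectedFalseMatches(devicesScanned, falsePositiveRate) {
  return devicesScanned * falsePositiveRate;
}

// Suppose the matching step wrongly flags an innocent device 0.1% of the time:
const rate = 0.001;

expectedFalseMatches(10, rate);    // targeted warrant, a handful of devices: 0.01
expectedFalseMatches(10000, rate); // city-block geofence: 10 expected false matches
```

Scanning a thousand times more people multiplies the expected false matches a thousandfold, which is the statistical core of the concern: the "match" rate stays fixed while the pool of innocent bystanders explodes.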

    Read at 11:34 am, Aug 20th

  • What we’ve learned about police shootings 5 years after Ferguson

    In 2015, The Washington Post launched a real-time police shooting database to record and analyze every fatal shooting by an on-duty police officer in the United States. In the 4½ years since, The Post has tracked nearly 4,400 fatal shootings. This is what we found:

    Read at 01:59 am, Aug 20th

  • Dorian Johnson, witness to the Ferguson shooting, sticks by his story

    FERGUSON, Mo. — Michael Brown was still lying dead in the street when his family gathered in the kitchen of a nearby apartment to hear from Dorian Johnson, the friend who saw what happened.

    Read at 01:49 am, Aug 20th

Day of Aug 19th, 2019

  • Give children the vote — Crooked Timber

    Looking at the array of ignorant and vindictive old men attacking Greta Thunberg and other young climate activists, the case for lowering the voting age is just about unanswerable. Anything that could be urged in justification of stopping 16 year olds, as a group, from voting, is equally applicable to those over 60 (a group to which I belong). Over 60 voters are, on average, poorly educated (the school leaving age in Australia was 15 when they went through and I assume similar in most places), and more likely to hold a wide range of false beliefs (notably in relation to climate change). Worse, as voters the over 60s have ceased to act, if they ever did, as wise elders seeking the best for the future. Rather (on average) they vote in a frivolous and irresponsible way, forming the support base for loudmouthed bigots and clowns like Trump, Johnson and, in Australia, Pauline Hanson (the last of whom, unsurprisingly, supports an increase in the voting age). Substantively, they respond to unrealistic appeals to nostalgia, wanting to Make America Great Again, and restore the glories of the British Empire, while dismissing concerns about the future. If my age cohort were to be assessed on the criteria applied to 16 year olds, we would be disenfranchised en masse. Of course, we can’t do that kind of thing in a democracy. That’s why we should act consistently with the core democratic principle that those affected by a decision should have a say in making it, unless they are absolutely disqualified in some way. In my view, that makes an open-and-shut case for lowering the voting age to 16. But where should we stop? If we set the bar at the level of emotional maturity and intelligence shown by, say, the crowd at a Trump rally, most 12 year olds would clear it with ease. So, how about giving everyone a vote? For young children, that would amount to giving parents an extra vote, though it’s worth noting that opponents of women’s suffrage made the same claim about husbands. 
In any case, the assumption that parents would vote in their children’s interest seems much more defensible than the idea that the old, as a group, will vote unselfishly about decisions (Brexit, for example, or wartime conscription) that will have little effect on them, but drastic consequences for the young. More importantly, the age at which young people stop doing as their parents tell them is well below 18. Allowing them to engage directly in the democratic process would be an unambiguously good thing, whether or not they chose more wisely than their elders. Source: Give children the vote — Crooked Timber

    Read at 12:50 am, Aug 19th

  • Stop the Slaughter of Our Children With These Weapons of War

    Since the massacre at Sandy Hook Elementary School in which 20 of our first graders were slaughtered with an AR-15-style rifle, I have pondered despairingly how long it would take for America to put an end to the killing generated by these weapons of war.

    Read at 05:00 pm, Aug 19th

  • Trump Weighs New Stance on Guns as Pressure Mounts After Shootings

    WASHINGTON — In the wake of two mass shootings, the divisive politics of gun control appeared to be in flux on Thursday as President Trump explored whether to back expanded background checks on gun purchasers and Senator Mitch McConnell, the Republican majority leader, signaled that he would at le

    Read at 01:41 pm, Aug 19th

  • Indexes in PostgreSQL — 4 (Btree)

    We've already discussed PostgreSQL indexing engine and interface of access methods, as well as hash index, one of access methods. We will now consider B-tree, the most traditional and widely used index. This article is large, so be patient.

    Read at 01:37 pm, Aug 19th

Day of Aug 18th, 2019

Day of Aug 17th, 2019

  • Jay-Z Helps the NFL Banish Colin Kaepernick - The Atlantic

    Now he’s in business with the league. Kaepernick’s girlfriend, Nessa Diab, wrote on Twitter that Kaepernick didn’t speak with Jay-Z before he brokered his deal with the NFL. Jay-Z said yesterday that he spoke to Kaepernick on Monday, but he wouldn’t divulge how their conversation went. A source close to Kaepernick, speaking on the condition of anonymity because of the sensitivity of the topic, told me, “It was not a good conversation.” But it was all smiles yesterday between Jay-Z and Goodell. “We don’t want people to come in and necessarily agree with us; we want people to come in and tell us what we can do better,” Goodell said at the press conference. “I think that’s a core element of our relationship between the two organizations, and with Jay and I personally.” The financial arrangements have not been made public. But whatever the numbers, the NFL’s new partnership with Jay-Z is a huge win for the league. Some of the biggest celebrities in the world have voiced their support of Kaepernick, saying they would boycott the NFL until Kaepernick is back in the league. Now that the NFL has Jay-Z’s blessing, it’s conceivable that some of those entertainers who distanced themselves from the NFL might change their mind. Jay-Z has given the NFL exactly what it wanted: guilt-free access to black audiences, culture, entertainers, and influencers. NFL officials must have been bothered by how much Kaepernick was discussed during Super Bowl week earlier this year. Not only did Goodell have to answer more questions about why Kaepernick still isn’t receiving any interest from NFL teams, but there had also been a number of reports that the league was having a hard time finding performers for its halftime show. Some stars, including Rihanna and Cardi B, reportedly turned down the opportunity to appear at the event out of allegiance to Kaepernick. 
Other celebrities, such as the comedian Amy Schumer, publicly pressured the Maroon 5 singer Adam Levine to pull out of his performance. The Reverend Al Sharpton, the civil-rights leader, blasted the rapper Travis Scott, who performed with Levine. “You can’t fight against Jim Crow and then go sit in the back of the bus,” Sharpton told TMZ. Ironically, one of the people who also advised Scott not to perform at the Super Bowl was Jay-Z. Yesterday the Roc Nation founder said he’d told Scott he shouldn’t perform at the Super Bowl because he would be playing “second fiddle” to Maroon 5. It had nothing to do with Kaepernick. Clearly Jay-Z’s support of Kaepernick only went so far. Regardless, why would Jay-Z waste any of his enormous social and cultural capital on the NFL when he doesn’t need the league’s platform, money, resources, or validation? I get that Jay-Z might see this as an opportunity for artists to connect with the NFL’s immense audience. He could also offer some incredible insight and direction to the league on the social-justice front, since he’s been actively engaged in such work for a long time. I also understand that, to become hip-hop’s first billionaire, Jay-Z didn’t always have the luxury of avoiding relationships and partnerships with people he disagreed with or disliked. Source: Jay-Z Helps the NFL Banish Colin Kaepernick – The Atlantic

    Read at 04:26 pm, Aug 17th

  • Israel Is Barring Rashida Tlaib and Ilhan Omar From the Country, Official Reportedly Says

    An Israeli official just confirmed that U.S. Congresswomen Rashida Tlaib and Ilhan Omar will be barred from entering the country ahead of a planned trip.

    Read at 11:20 pm, Aug 17th

  • Obama Reportedly Warned Biden About 2020: ‘You Don’t Have to Do This, Joe’


    Read at 11:19 pm, Aug 17th

  • Insurance Companies Are Paying Cops To Investigate Their Own Customers

    When police showed up at Harry Schmidt's home on the outskirts of Pittsburgh, he thought they were there to help. He was still mourning the disappearance of the beloved forest green Ford F-150 pickup that he’d customized with a gun storage cabinet, and he hoped the cops had solved the crime.

    Read at 11:18 pm, Aug 17th

  • Many Democrats Love Elizabeth Warren. They Also Worry About Her.

    COUNCIL BLUFFS, Iowa — Senator Elizabeth Warren has built the most formidable campaign organization of any Democratic presidential candidate in the first nominating states, raised an impressive $25 million without holding high-dollar fund-raisers, and has risen steadily in Iowa and New Hampshire p

    Read at 10:25 pm, Aug 17th

  • Bad headline, small changes at the New York Times

    Separating from the Times was not a decision she took lightly, Walsh said. “I’ve put this off for almost 3 years. They are blowing their coverage of this crisis. I’m out.” I’m still in. I consider myself a Times loyalist.

    Read at 05:56 pm, Aug 17th

  • A Friend to Israel, and to Bigots: Viktor Orban’s ‘Double Game’ on Anti-Semitism

    BUDAPEST — In late November, the office of Hungary’s far-right prime minister, Viktor Orban, announced it would donate $3.4 million to causes fighting anti-Semitism in Europe. The next day, a magazine controlled by Mr.

    Read at 05:43 pm, Aug 17th

  • Netanyahu advisers hatched anti-Semitic conspiracy against George Soros

    Israeli leaders helped birth today’s most notorious anti-Semitic conspiracy theory. That’s the conclusion from recent reports about the origins of a right-wing plot against liberal Jewish billionaire George Soros.

    Read at 05:36 pm, Aug 17th

  • Netanyahu Was Quick to Denounce Rival's 'anti-Semitism.' Here Are 5 Times He Stayed Silent

    In an outraged tweet, Prime Minister Benjamin Netanyahu declared Monday that Yair Lapid — one of his rivals in Israel’s September 17 do-over election — “must not be allowed to be prime minister” after the Kahol Lavan No. 2 posted a satirical video “worded in an anti-Semitic tone.

    Read at 05:28 pm, Aug 17th

Day of Aug 16th, 2019

  • WordPress Theme Review Team Scraps Trusted Authors Program Due to Gaming and Inconsistent Reviews – WordPress Tavern

    After several months of discussion, WordPress.org’s Theme Review Team has decided to discontinue the Trusted Authors (TA) Program that launched in April 2018. The program, which was controversial from its inception, allowed certain authors to bypass the normal theme review queue after demonstrating an ability to submit themes with fewer than three issues. Trusted Author theme submissions went to their own dedicated queue that was handled by team leads. The objective of the program was to streamline the review process and lessen the burden on reviewers. When it failed to deliver the intended results, the Theme Review team leads made a unilateral decision behind closed doors, implementing a change requiring TA participants to join the team and perform a minimum number of reviews in order to continue having their own themes fast tracked through the review process. This was loudly decried by other members of the Theme Review team who were blindsided by the decision. “We are removing the Trusted Author Program,” team lead William Patton announced in the most recent meeting. “It has not fulfilled the intended plan and has caused more problems than it is solving.” Fellow team lead Sandilya Kafle outlined the reasons in a post published today. The entrance requirements for the program did not ensure that participants were truly “trusted” authors, as many had to be removed for gaming the system. Reviewers also reported that there was a group of people releasing clones of themes every week. “We got lots of help from the TA authors – for which we’d like to thank them,” Kafle said. “However, there was still gaming from some of the authors – which resulted in their removal from the TA program. One of the intentions of the TA program was to reduce the gaming by the use of multiple accounts. 
However, we still saw some authors having multiple accounts so this intention was not realized though the program existing.” The TA program’s entrance requirements also did not ensure that participants were prepared to review themes at a high level, which resulted in inconsistent reviews. “We strongly believed that TA members were highly familiar with the requirements but we found that was not the case for all of them,” Kafle said. “Additionally, some authors did not feel confident enough in their own understanding of all requirements to perform reviews and set themes live. Instead many TA reviews went to the admin queue after approval. This was an indicator that the quality of the themes by TA’s may not be as high as expected.” Most of the Theme Review team members present in the meeting generally agreed on shutting the TA program down. Alexandru Cosmin, the former team lead who introduced the program, was the only vocal outlier, whose acrid responses to scrapping the program reflect a long-standing frustration with the slow queue. “Honest opinion, and I could bet on this: by the end of the year we’ll have 5-month queues and multi-accounters,” Cosmin said. “We’ll see how fair it will be when you have guys with 15 accounts and authors complaining in the main chat about how long the queue is.” Today’s decision to discontinue the TA program restores the natural order to the queue, with all theme authors receiving the same treatment. Tying an incentive program to the review system was ineffective for taming the queue. Long queues and gaming the system have proven to be continual struggles for the Theme Review Team, but the existence of these problems underscores the significance of the official themes directory for theme shops. Companies continue to use WordPress.org to gain users for their commercial versions, and the directory remains an important distribution channel for WordPress themes.
Source: WordPress Theme Review Team Scraps Trusted Authors Program Due to Gaming and Inconsistent Reviews – WordPress Tavern

    Read at 03:37 pm, Aug 16th

  • Introducing the New React DevTools – React Blog

    We are excited to announce a new release of the React Developer Tools, available today in Chrome, Firefox, and (Chromium) Edge! What’s changed? A lot has changed in version 4! At a high level, this new version should offer significant performance gains and an improved navigation experience. It also offers full support for React Hooks, including inspecting nested objects. Visit the interactive tutorial to try out the new version or see the changelog for demo videos and more details.

    Which versions of React are supported?
      react-dom: 0-14.x not supported; 15.x supported (except for the new component filters feature); 16.x supported
      react-native: 0-0.61 not supported; 0.62 will be supported (when 0.62 is released)

    React DevTools is available as an extension for Chrome and Firefox. If you have already installed the extension, it should update automatically within the next couple of hours. If you use the standalone shell (e.g. in React Native or Safari), you can install the new version from NPM:

      npm install -g react-devtools@^4

    Where did all of the DOM elements go? The new DevTools provides a way to filter components from the tree to make it easier to navigate deeply nested hierarchies. Host nodes (e.g. HTML <div>, React Native <View>) are hidden by default, but this filter can be disabled. How do I get the old version back? If you are working with React Native version 60 (or older) you can install the previous release of DevTools from NPM:

      npm install --dev react-devtools@^3

    For older versions of React DOM (v0.14 or earlier) you will need to build the extension from source:

      git clone https://github.com/facebook/react-devtools
      cd react-devtools
      yarn install
      yarn build:extension

    Thank you! We’d like to thank everyone who tested the early release of DevTools version 4. Your feedback helped improve this initial release significantly. We still have many exciting features planned and feedback is always welcome! 
Please feel free to open a GitHub issue or tag @reactjs on Twitter. Source: Introducing the New React DevTools – React Blog

    Read at 03:13 pm, Aug 16th

  • JavaScript: async/await with forEach() - codeburst

    Source: JavaScript: async/await with forEach() – codeburst
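The excerpt for this entry was lost to the site's anti-bot interstitial, but the title points at a well-known pitfall: Array.prototype.forEach does not await an async callback. A minimal sketch of that behavior (my own illustration, not code from the article):

```javascript
// forEach fires each async callback and discards the returned promises,
// so the function returns before any `await` inside the callbacks resumes.
async function sumWithForEach(nums) {
  let total = 0;
  nums.forEach(async (n) => {
    await Promise.resolve(); // any await suspends the rest of the callback
    total += n;
  });
  return total; // still 0: no callback has resumed yet
}

// A for...of loop awaits each step sequentially, so the sum completes.
async function sumWithForOf(nums) {
  let total = 0;
  for (const n of nums) {
    await Promise.resolve();
    total += n;
  }
  return total;
}

sumWithForEach([1, 2, 3]); // resolves to 0
sumWithForOf([1, 2, 3]);   // resolves to 6
```

When the iterations are independent and can run concurrently, `await Promise.all(nums.map(async (n) => ...))` is the other common fix.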

    Read at 03:12 pm, Aug 16th

  • Simplifying Gatsby Unit Testing using a Jest Preset

    Recently I have been expanding the codebase for my website and blog in order to have a more robust web presence. Part of this of course includes unit testing the source code using Jest.

    Read at 11:13 pm, Aug 16th

  • How to shallow render Jest Snapshot Tests

    If you are using Snapshot Tests with Jest for your components, there are a few pitfalls you have to be aware of. Two of them are very likely to apply to your written tests as well:

    Read at 01:28 am, Aug 16th

  • MTA Police To Take Action Against Homeless People On The Subway

    This weekend, New York state will deploy police officers and other employees to reduce the number of homeless people in the subway system, a function normally undertaken by the city.

    Read at 01:22 am, Aug 16th

  • Israeli Decision on Omar and Tlaib Inflames Politics in Two Countries

    JERUSALEM — Under intense pressure from President Trump, Prime Minister Benjamin Netanyahu’s government on Thursday barred two members of the United States Congress from entering Israel for an official visit, reversing a previous decision to admit two of the president’s most outspoken critics.

    Read at 01:19 am, Aug 16th

Day of Aug 15th, 2019

  • Logic-less JSX | Jonathan Verrecchia

    HTML is a declarative language with great readability. JSX, and templating engines in general, give us the power to mix logic and HTML. We've all experienced how JSX can become difficult to understand as our components grow in complexity. This article presents approaches that we can use to keep the mess out of our JSX, and make our code more readable and maintainable. Let's take a look at a simple example. Here is a design and its markup:

      <ul>
        <li><a href="url/animal/1">Dog</a> - 4 legs - Friendly</li>
        <li><a href="url/animal/2">Bird</a> - 2 legs</li>
        <li><a href="url/animal/3">Snake</a> - 0 legs - Unfriendly</li>
        <li><a href="url/animal/4">Centipede</a> - ? legs - Not enough data!</li>
      </ul>

    And here is the data that we are going to use to render it: A first intuition to create the component is to use the properties of the animal objects from the data as props: Note: Turning numbers like into strings for rendering is convenient to avoid accidental falsy conditions when their value is . I use Lodash's . Here we've got some logic mixed with the markup and it's already quite messy, even for a small component like this. Wouldn't it be nicer to get rid of most of that logic and have the following more declarative JSX instead? In order to use this lighter JSX, we need to transform the raw data into render data. Where to transform the data Our components might need the raw data for things like state management, styles, or handlers, so in my opinion it makes more sense to pass that raw data as props and have the transformation done within the component, rather than outside of it. Here are 4 different approaches that we can use to transform the data within the components. They all have pros and cons, so it's up to your personal preference. 
Approach 1: Variables An intuitive approach to move some logic out of our JSX can be to create some intermediate variables in the body of the function of the component: The strength of this approach is that it is pretty simple to understand, and doesn't introduce any new concept. The body of the function can get pretty big if there is a lot of logic, but at least that logic is right there, accessible and clear. Note: What I do in is a pseudo inline . If you use this you should keep in mind that any value you pass to the switch is converted to a string. Booleans, numbers, , and included. Also, if the right side of the switch cases produce side-effects, you need to put them in functions and execute the function at the end of the switch. See my other example below. Here is the CodeSandbox of this implementation. Approach 2: Second component We can move that logic into a second component that will transform the props, and possibly do other smart things such as handling a state. I chose to add a suffix to the markup component. It looks like this: The strength of this approach is the isolation of the component, which can be used for pure logic-less rendering in a Storybook. It also has a clean separation of concerns, and you can put those components in 2 different files if that's what you want. The downside of the intermediate component is that it can be confusing at first. It also adds one extra depth level per component which is a bit annoying when using React Devtools, and might have some performance impact that I haven't tested. If the components have a state, I'd favor putting that state in the "transforming" component rather than the one, to keep the latter free from any logic. Here is the CodeSandbox of this implementation. Note: An alternative implementation is to use the HOC from Recompose (library currently unmaintained). It wouldn't be possible to use only if you want to hold a state though, you would also need a HOC like . 
Approach 3: Function If you extract the transforming logic into a function, you get this: This version is a bit of a mix of the last two. It is simpler than introducing a new component, but doesn't give you the option to isolate the logic-less JSX code, and the function cannot hold a state. Its special feature however is that unlike the previous approaches, this function can be used outside of the context of React, by your server for instance, or if you switch to a different UI library than React one day. As you can see, if we deconstruct our render data, and if our line is too long, we might have to place each property on a single line, which takes a lot of space (at least that's how Prettier would format it). The first 2 approaches don't have this problem. Here is the CodeSandbox of this implementation. Approach 4: Class This is very similar to the function approach, except that it is using a class: The main difference with the function implementation is that with a class, the transforming functions are only executed when those properties are accessed, which makes it more performant, at the cost of being more verbose. But if you intend to use this render data outside of React, to get the of an animal on the server for instance, this is a more optimized choice than the function. Here is the CodeSandbox of this implementation. 🎉 Special thanks to my friend and ex-colleague from Yelp Benjamin Knight, who came up with the class approach, reviewed this article and helped me improve it. He also coined the term "render data" that I'm using here. What goes where – ⚠️ opinion I am still trying to figure out what works for me in terms of what I should put in the JSX and what I should extract to the render data, but here is my current take on it. What goes in my JSX Code is clearer than words so there you go: If conditions are more complex than this, then I would probably put them in the render data. This list is very likely to change. 
What goes in the render data I guess most logic that derives the props and ends up going into the JSX could be "render data". It's pretty much just helpers to keep your JSX as lean as possible. It's difficult to make an exhaustive list of that one. However I think it is a good idea to not put any JSX code in the render data. It can be tempting to do something like and inject in the JSX, but if we do this, we break the separation of concerns and the features that come with it. Instead, keep the markup in the JSX, or create a new component and use . As pointed out by /u/nschubach, declaring and (as well as ) in the render data does break the separation of concerns. It is similar to having markup there. So a purer version of the approach 1 would be: Depending on how much separation of concerns you want, you might be okay with having some strings defined in the render data though. Another thing that should probably not go in the render data is styles. Even if they are based on props. Styles I try to have as few styles in the JSX as possible, because just like logic, they can make a big mess in our JSX. And the whole point of logic-less JSX is to keep the JSX lean. Note that I am not only referring to inline styles, but also CSS-in-JS libraries that support writing styles in the JSX, like Emotion. Instead, I prefer using CSS-in-JS classes. In my opinion, an essential feature for a CSS-in-JS library is to apply styles based on props. Styled Components, Emotion (with Styled Components), JSS, or Material UI support it for instance. With that feature, no need to put styles in the render data. CSS-in-JS brings encapsulation and styles inheritance to CSS. So I think now is a good time to switch back to the old way of writing styles, with "semantic" classnames such as instead of things like . This helps keep our JSX light too. But that's a very big debate for another day. Closing words Depending on your project, one approach or the other might work better for you. 
My personal preference goes to the first version, with variables in the body of the component. It is less modular than the other ones but it is the clearest and most compact, and should be enough in common cases. But if I work in a very modular context, where front-end developers or web designers focus on the front of the front-end (HTML and CSS), the second approach with the pure would get my vote, as it provides the cleanest separation of concerns. What's your personal preference? Is there something that I am missing? Let me know on Reddit or Twitter. I'm always happy to have my mind changed. Posted on August 13th, 2019 Back to my site Source: Logic-less JSX | Jonathan Verrecchia
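The post's inline code samples were stripped during extraction. As a rough reconstruction of the spirit of "Approach 3: Function" (the data shape, field names, and labels below are my assumptions, not the author's original code), a transform from raw animal data to render data might look like:

```javascript
// Hypothetical raw-data fields (name, legs, friendly), chosen to match the
// rendered list at the top of the post. The transform owns all the logic,
// so the JSX only interpolates the resulting strings.
function getRenderData(animal) {
  return {
    name: animal.name,
    // `== null` (not truthiness) so that 0 legs still renders as "0 legs",
    // the falsy-number pitfall the article warns about
    legsLabel: animal.legs == null ? "? legs" : `${animal.legs} legs`,
    temperamentLabel:
      animal.friendly === true ? "Friendly" :
      animal.friendly === false ? "Unfriendly" :
      animal.legs == null ? "Not enough data!" : "",
  };
}

getRenderData({ name: "Dog", legs: 4, friendly: true });
// → { name: "Dog", legsLabel: "4 legs", temperamentLabel: "Friendly" }
```

The same function works unchanged inside a component body (Approach 1), in a wrapping component (Approach 2), or as class getters (Approach 4); only where the logic lives differs.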

    Read at 09:56 pm, Aug 15th

  • Michigan Democrat Tlaib Endorses One-State Solution – The Forward

    (JTA) — A Michigan Democrat who is all but certain to become a congresswoman said she would “absolutely” vote against military aid to Israel, sparking criticism from a Jewish Democratic group. The candidate, Rashida Tlaib, also said in an interview that she favors a one-state solution to the Israeli-Palestinian conflict, as opposed to a two-state solution that would establish a Palestinian state alongside Israel. Tlaib, the daughter of Palestinian immigrants, recently won the Democratic nomination in Michigan’s 13th Congressional District. The Republicans are not running a candidate in the Detroit-area district. She is one of two Muslim women likely to be the first to be elected to Congress. The other is Ilhan Omar, who handily won her Democratic primary Tuesday in a Minneapolis-area district. Asked whether she would consider slashing military aid to Israel in a Monday interview with Britain’s Channel 4, Tlaib responded, “Absolutely, if it has something to do with inequality and not access to people having justice. For me, U.S. aid should be leverage. I will be using my position in Congress so that no country, not one, should be able to get aid from the U.S. when they still promote that kind of injustice. “So much is about ‘let’s choose a side,’” she continued, opining on the Israeli-Arab conflict. “I am for making sure that every single person there has every right to thrive.” In a subsequent interview Tuesday with In These Times magazine, Tlaib endorsed a one-state solution and supported the free speech rights of BDS activists, who push for Boycott, Divestment and Sanctions against Israel. “One state,” she said in response to a question about whether she supports a one- or two-state solution. “It has to be one state. Separate but equal does not work. I’m only 42 years old but my teachers were of that generation that marched with Martin Luther King. 
This whole idea of a two-state solution, it doesn’t work.” That position runs counter to the stance of JStreetPac, a group affiliated with the liberal Middle East policy organization, which has endorsed her. According to her candidate page on JStreetPac’s website, Tlaib “believes that the US should be directly involved with negotiations to reach a two-state solution.” J Street told JTA on Wednesday that it is seeking clarification from Tlaib’s campaign regarding her recent statements. “JStreetPAC was created to demonstrate the wellspring of political support that exists for candidates who take pro-Israel, pro-peace positions, chief among them, support for a two-state solution to the Israeli-Palestinian conflict,” read the statement to JTA from Jessica Rosenblum, J Street’s senior vice president of public engagement. “We are clear and unequivocal with all the candidates who we consider for endorsement what our core principles and commitments are. We only endorse candidates who have affirmed support for them.” In response to Tlaib’s Monday statement on military aid on Britain’s Channel 4, the Jewish Democratic Council of America cited joint military and missile programs the following day in describing the U.S.-Israel relationship as “mutually beneficial.” It said that “threatening to cut military assistance to Israel is inconsistent with the values of the Democratic Party and the American people.” The group pledged to engage with Tlaib and explain why “U.S. military aid to Israel is a national security priority.” The statement came out at approximately the same time as Tlaib’s In These Times interview and does not reference it. 
Speaking to the Huffington Post last week, Tlaib said she was “going to be a voice” for her family in the West Bank, with whom she maintains close ties, saying that she wanted to break down barriers between Israelis and Palestinians, “two people who have so much more in common.” “I look forward to being able to humanize so many of them that have felt ‘less than’ for so long,” she said. Like Tlaib, Omar has been critical of Israel and once called it an “apartheid regime.” More recently she came out against the boycott Israel movement. Omar, a Somalia-born community activist and representative in the State House, is favored to win in November in the 5th District now held by Keith Ellison, who won the DFL primary for state attorney general. DFL is the state’s Democratic Party. Ellison was the first Muslim elected to Congress, in 2006. Source: Michigan Democrat Tlaib Endorses One-State Solution – The Forward

    Read at 09:23 pm, Aug 15th

  • Trump’s Perfidy, Israel’s Self-Own - The Atlantic

    Even President Donald Trump, who has tweeted racist attacks against Tlaib and Omar and called Democrats an “anti-Israel, anti-Semitic party,” managed to divert his mania in other directions for a few weeks. It was too good to last. This morning, the Israeli government reversed itself with an announcement that it would not permit Tlaib and Omar to visit. Even before it was official, Trump, on cue, uncorked the gasoline, tweeting: “It would show great weakness if Israel allowed Rep. Omar and Rep. Tlaib to visit. They hate Israel & all Jewish people, & there is nothing that can be said or done to change their minds. Minnesota and Michigan will have a hard time putting them back in office. They are a disgrace!” Cards on the table: I strongly disagree with Tlaib’s and Omar’s critiques of Israel. Their sympathy for Palestinians is understandable. I share it and have long supported a two-state solution that ends the occupation and gives Palestinians sovereignty and Israel security. But their views go much further. They call into question the legitimacy of Israel in any borders. They have voiced support for the BDS movement, which advocates the end of Israel, rather than the establishment of two states, and assigns all responsibility for this conflict, in all its historical complexity, to Israel alone. Omar tweeted anti-Semitic tropes, for which she later apologized. My eyes are open about Israel’s contributions to the stalemate: the expansion of settlements, demolitions of Palestinian homes, and open calls for annexation of the West Bank. Palestinian leaders who don’t educate their people to accept Israel’s legitimacy and permanence and give tacit or explicit backing to violence also bear responsibility. And Trump walking away from long-standing U.S. support for two states has emboldened the rejectionists in both camps. 
If Tlaib and Omar did not plan to use their trip to hear a broad range of views on both sides, then they were missing the opportunity to see things they have never seen, learn from people they have never met, and gain new insights into the lives of Israelis and Palestinians and pathways to resolve their endless conflict. But the Israeli government, which has too often taken sides in America’s partisan wars of late, initially made the smart, strategic decision. It prioritized the bipartisan foundations of the U.S.-Israel relationship, the understanding that the Congress represents U.S. citizens and taxpayers who, year after year, provide major financial support to Israel’s ability to defend itself. It was willing to endure a visit in which critical voices would be heard, just as a mature, confident democracy should. What happened? There are only two possible explanations for this about-face. Source: Trump’s Perfidy, Israel’s Self-Own – The Atlantic

    Read at 09:07 pm, Aug 15th

  • Rep. Ilhan Omar is funded by Israel-hating BDS promoters and PACs | JNS.org

    Morton A. Klein and Elizabeth A. Berney The list is long, and her anti-Semitic slurs are significant. Nevertheless, Congress supports Israel because the American people support Israel; both nations share democratic values and strategic interests. (February 15, 2019 / JNS) Israel-hating Congresswoman Ilhan Omar (D-Minn.) has repeatedly displayed deep-seated, dangerous anti-Semitism and hatred towards the Jewish people. Rep. Omar falsely called Israel “evil” and an “apartheid regime.” She promotes anti-Israel boycotts, divestment and sanctions (BDS)—the delegitimization and economic destruction of Israel—after lying to voters about her BDS position while running for Congress. Rep. Omar sought reduced sentences for ISIS terrorists, a position that endangers every American. And she’s never apologized for any of these issues. The Democratic House leadership’s recent condemnations of Rep. Omar are an important first step, but they are not enough, especially because she immediately backtracked on her phony “non-apologies” by again defaming Jews and Israel. From the outset, the Zionist Organization of America urged that Rep. Ilhan Omar be removed from the House Foreign Relations Committee, which is a position that enables her to harm U.S.-Israel relations and mainstream her virulent anti-Semitism. We are thankful that this week, U.S. President Trump called for Rep. Omar’s ouster from the committee, at a minimum. Rep. Omar’s most recent anti-Semitic tweets—that a Jewish lobby organization (AIPAC) and money cause Congress to support Israel—are woefully wrong. Congress supports Israel because the American people, Jewish and non-Jewish, support Israel; it’s the moral thing to do, and both nations share democratic values and strategic interests. Rep. Omar’s anti-Semitic “Jewish money” tweets prompted us to examine her own funders. ZOA’s analysis of Rep. 
Omar’s Federal Election Commission (FEC) filings found that her financial supporters include a veritable “Who’s Who” of anti-Israel propagandists and haters and anti-Israel PACs. For instance: CAIR (Council on American-Islamic Relations) is one of Rep. Omar’s top 20 contributors. CAIR is an unindicted co-conspirator in the Holy Land Foundation terror financing trial, for funneling money to Hamas. FBI testimony reportedly indicated that CAIR is a Hamas front group. The Investigative Project on Terrorism (IPT) detailed CAIR leaders’ statements supporting Hamas, and the HLF trial judge’s ruling that the government’s evidence created “at least a prima facie case as to CAIR’s involvement in a conspiracy to support Hamas.” CAIR-CA PAC gave her $5,000. In addition, CAIR-CA’s executive director, Hussam Ayloush, who called for Israel’s “termination,” gave Ilhan Omar $1,200. A CAIR Florida employee gave Rep. Omar another $500. James Zogby, president of the anti-Israel Arab-American Institute (AAI), chairman of the anti-Israel Palestine Human Rights Campaign, and a major anti-Israel propagandist, gave Rep. Omar $2,700. Zogby falsely accused Israel of committing a “Holocaust” against Palestinians; called Israelis “Nazis”; campaigned to prevent the extradition to Israel of a Fatah terrorist who killed two Israeli teenagers and wounded 36 other Israelis; called Cuban-American Congresswoman Ileana Ros-Lehtinen an “Israel-firster” (an anti-Semitic trope implying dual loyalty); praised the intifada as a “good story”; and was a leading architect of propaganda themes used to pry progressive Jews away from supporting Israel. Rep. Rashida Tlaib (D-Mich.), who wrapped herself in the flag of Mahmoud Abbas’s murderous, hostile, Palestinian-Arab regime to celebrate her election win, sent Rep. Omar three contributions. 
Tlaib also supports replacing Israel with a Palestinian Arab state, repeatedly hurls false accusations at Israel, says that Israel doesn’t deserve human rights, and invoked anti-Semitic disloyalty-to-America smears against U.S. senators who sponsored an anti-BDS bill. A professor from California Islamic University, which “arm[s] [students] with tools needed to spread Islam wherever they go,” donated to Rep. Omar. The Soros-funded MoveOn.org, which attempted to kill the pro-Israel anti-BDS bill and essentially blackmailed congresspersons to support the disastrous Iran deal, gave Rep. Omar $5,000 and was one of her top 20 contributors. MoveOn.org also co-founded Avaaz, which initiates extremely offensive, falsehood-filled anti-Israel campaigns, including campaigns to release Palestinian Arab terrorist Ahed Tamimi, and for Ireland’s dangerous BDS Bill. A $500 donor to Rep. Omar posted photos on his Facebook profile showing his wife wearing Hamas scarves, stating in Arabic: “Jerusalem is ours, WE ARE COMING!” and included maps in the shape of Israel over-written with the words: “Palestine” and “From the river to the sea!” (the well-known phrase calling for Israel’s extinction). OneVoice gave Rep. Omar $1,000. The Senate Permanent Subcommittee on Investigations’ 2016 bipartisan staff report found that OneVoice used campaign infrastructure built with $350,000 of taxpayer money, granted by President Obama’s State Department, to interfere with Israel’s election and fund efforts to oust Israeli Prime Minister Benjamin Netanyahu. Bend the Arc Jewish Action Inc. PAC, founded and chaired by anti-Zionist George Soros’s son, Alex Soros, gave Rep. Omar $1,000. Blogger Victor Rosenthal explained that Bend the Arc funnels Jewish contributions to left-wing politicians who support anti-Israel policies, without mentioning Israel, to enable left-leaning Jewish contributors to donate to anti-Israel politicians (such as Ilhan Omar) without feeling guilty about betraying their people. 
Debbie (Dhabah) Almontaser, who defended an Arab Women’s group for hawking “Intifada NYC” T-shirts that glorify Palestinian-Arab terror, gave Omar $500. Rep. Omar’s most recent non-apology added: “At the same time, I reaffirm the problematic role of lobbyists in our politics, whether it be AIPAC, the NRA or the fossil fuel industry.” Hypocritically, however, Rep. Omar accepted oil money from Beowulf Energy’s principal Nazar Khan ($2,000); Giant Oil president Basem Ali ($2,000); and an ExxonMobil specialist. She also hypocritically accepted approximately $190,000 of PAC money. Disappointingly, “Nancy Pelosi for Congress” previously gave Rep. Omar $2,000. Now that Rep. Omar has repeatedly displayed her deep-seated anti-Semitism, Speaker Pelosi has condemned those comments, saying “Omar’s use of anti-Semitic tropes and prejudicial accusations about Israel’s supporters is deeply offensive.” We hope Speaker Pelosi will discontinue financially supporting Rep. Omar and immediately remove her from the House Foreign Relations Committee. Rep. Omar must be isolated, defunded and removed from positions—meaning from every committee where she can carry out the extremist anti-Israel anti-American agenda of those who are backing her financially. Morton A. Klein is the national president of the Zionist Organization of America. Elizabeth A. Berney is ZOA’s director of special projects. Source: Rep. Ilhan Omar is funded by Israel-hating BDS promoters and PACs | JNS.org

    Read at 08:55 pm, Aug 15th

  • ‘From The River To The Sea’ Doesn’t Mean What You Think It Means – The Forward

    Over the weekend, scholar and social justice activist Marc Lamont Hill apologized for ending his recent remarks at the United Nations by calling for “a free Palestine from the river to the sea.” His apology came after three days of furious online attacks and criticism from many people who felt deeply hurt by his remarks. Critics have pointed to Hamas’s use of this phrase to claim that Hill was either deliberately parroting a Hamas line that calls for Israel’s elimination, or at the very least ignorantly repeating a deeply offensive and triggering phrase. Yet lost in all these discussions is any acknowledgement of what this phrase actually means — and has meant — to Palestinians of all political stripes and convictions. As a Palestinian American and a scholar of Palestinian history, I’m concerned by the lack of interest in how this phrase is understood by the people who invoke it. It helps to remember the context in which Hill delivered his original remarks. He made them last Wednesday as part of the Special Meeting of the Committee on the Exercise of the Inalienable Rights of the Palestinian People, in observance of the United Nations International Day of Solidarity With the Palestinian People, which is held during the final days of November each year. The date is important. On November 29th, 1947, the United Nations General Assembly voted to partition Palestine into a Jewish state and an Arab state. While Jews in Palestine rejoiced, the country’s Arabs bitterly opposed the partition plan. The reason was that they saw all of Palestine — from the river to the sea — as one indivisible homeland. They invoked the story of Solomon and the baby to explain their stance. Like the real mother in the parable, who begged Solomon to refrain from splitting her baby in half, Palestinian Arabs couldn’t stand to see their beloved country split in two. 
And they saw the Zionists’ eager reception of the plan as an ominous sign that they intended to conquer the whole of Palestine. Moreover, the proposed borders of the two states meant that the Jewish state would have roughly 500,000 Palestinians living in it as a minority. And while the Israeli narrative holds that those Palestinians would have been welcomed as equals in the new Jewish state, the clashes between Jews and Arabs in Palestine that followed the UN vote, particularly the attacks by Zionist militants and the subsequent forcible removal of Palestinians from their homes and lands in areas allotted to the Jewish state, led Palestinians to conclude otherwise. As for those Palestinians who managed to remain on their lands in the new Israeli state, they were eventually granted citizenship, but it was clearly subordinate to the status of Jewish Israelis. They were subject to military rule rather than civilian law, which meant they needed permits from the military governor to travel to work and school. They also encountered widespread prejudice from Israelis who saw them as a benighted, traditional underclass in need of the state’s benevolent modernization. And Palestinians in the West Bank and Gaza Strip, living under Jordanian and Egyptian rule respectively, faced authoritarian crackdowns that prevented them from being able to fully express their political views. In other words, after 1948, Palestinians were not able to live with full freedom and dignity anywhere in their homeland. That’s how the call for a free Palestine “from the river to the sea” gained traction in the 1960s. It was part of a larger call to see a secular democratic state established in all of historic Palestine. Palestinians hoped their state would be free from oppression of all sorts, from Israeli as well as from Arab regimes. 
To be sure, a lot of Palestinians thought that in a single democratic state, many Jewish Israelis would voluntarily leave, like the French settlers in Algeria did when that country gained its independence from the French. Their belief stemmed from the anti-colonial context in which the Palestinian liberation movement arose. That’s why, despite the occasional bout of overheated rhetoric from some leaders, there was no official Palestinian position calling for the forced removal of Jews from Palestine. This continued to be their position despite an Israeli media campaign following the 1967 war that claimed Palestinians wished to “throw Jews into the sea.” While Palestinians viewed Zionists as akin to colonial settlers, Jews who were willing to live as equals with the Palestinians were welcome to stay. In his 1974 speech to the UN, Fatah leader and PLO Chairman Yasser Arafat declared, “when we speak of our common hopes for the Palestine of tomorrow we include in our perspective all Jews now living in Palestine who choose to live with us there in peace and without discrimination.” In the 1980s and ‘90s, Fatah and the PLO changed their official stance from calling for a single state to supporting a two-state solution. Many Palestinians — particularly the refugees and their descendants — saw them as abandoning the core of their homeland and acquiescing to colonial theft. They were allowing the proverbial baby to be split. With Fatah seen as selling out, Hamas picked up the call for a free Palestine “from the river to the sea.” It sought to burnish its own anti-colonial bona fides at the expense of Fatah. And although many people point to Hamas’s 1988 charter as evidence of its hostility to Jews, in fact the group long ago distanced itself from that initial document, seeking a more explicit anti-colonial stance. 
Moreover, its 2017 revised charter makes even clearer that its conflict is with Zionism, not with Jews. And notwithstanding the extreme rhetoric of some leaders on both sides, a recent joint poll shows that only a small minority of Palestinians see “expulsion” as a solution to the conflict (15%), which is incidentally the same percentage of Israelis who view this as the only solution. What Palestinians do want is equal rights. They want to be able to work hard to achieve their dreams without being discriminated against. They want to be able to live where they choose without being told they can’t because of their ethnicity or religion. They want to be able to choose the leaders who control their lives. In other words, they want freedom. And they want that freedom throughout their historic homeland, not just on the 22% that comprises the West Bank and Gaza Strip. This desire for freedom is what Marc Lamont Hill was invoking when he called for “a free Palestine from the river to the sea.” His remarks were intended to center Palestinians’ aspirations, not disparage Israelis’. This has been lost on his critics, which speaks to a larger problem. Dismissing or ignoring what this phrase means to the Palestinians is yet another means by which to silence Palestinian perspectives. Citing only Hamas leaders’ use of the phrase, while disregarding the liberationist context in which other Palestinians understand it, shows a disturbing level of ignorance about Palestinians’ views at best, and a deliberate attempt to smear their legitimate aspirations at worst. Most troubling for me, the belief that a “free Palestine” would necessarily lead to the mass annihilation of Jewish Israelis is rooted in deeply racist and Islamophobic assumptions about who the Palestinians are and what they want. 
Rather than just lecture Palestinians and their supporters about how certain phrases make them feel, supporters of Israel should get more curious about what Palestinians themselves want. There isn’t a single answer (there never is), but assuming you already know is no way to work towards a just and lasting peace. Maha Nassar is an Associate Professor in the School of Middle Eastern and North African Studies at the University of Arizona and a 2018 Public Voices Fellow with the OpEd Project. Follow her on Twitter @mtnassar. The views and opinions expressed in this article are the author’s own and do not necessarily reflect those of the Forward. Source: ‘From The River To The Sea’ Doesn’t Mean What You Think It Means – The Forward

    Read at 08:32 pm, Aug 15th

  • The npm Blog — npm CLI Roadmap - Summer 2019

    Motion on the npm CLI project has been accelerating, and we’re now moving forward with a clear direction and vision. This document outlines what’s in store for the remainder of the npm v6 line, and what to expect in v7 and v8. Remaining npm v6 Releases npm v6 is officially in “bugfix and minor enhancement” mode as work on npm v7 is getting into full swing. That doesn’t mean that improvements won’t be made! But the architectural changes for v7 will require quite a bit of attention, and will be the priority moving forward. Expect weekly-to-biweekly releases to update dependencies and land pull requests, but major new features will likely be held off until v7. These releases will slow down precipitously once v7 is closer to release, but we do expect to continue to fix major bugs as long as people continue using it. Bugfixes will focus on the most pervasive and persistent problems that users experience. Files with incorrect ownership (root files in user-owned folders, and user-owned files in root-owned folders, especially). Long-standing issues with the uid-number and cmd-shim modules. Any low-hanging fruit to make WSL work better than it currently does. Ensure that the urls stored in package-lock.json are always https when they’re supposed to be. (This is just a cosmetic issue, since the CLI always requests via HTTPS to the public registry, and the registry redirects to HTTPS, but it’s one that spooks people a lot.) Inconsistencies in npm ci (other than those that require significant rework, since v7 will replace this codebase anyway). New features likely to land on the 6.x branch: Add support for the peerDependenciesMeta field. (#224) Add support for bundleDependencies: true (npm/read-package-json#81) Some other accepted rfcs, provided that they don’t involve breaking changes or conflict with other plans here. Generally, not much in the way of new features or major changes. 
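For reference, the peerDependenciesMeta field mentioned above lets a package mark a peer dependency as optional, so the CLI can skip warning when it is absent. A sketch of the shape (package names are invented for the example):

```json
{
  "name": "my-plugin",
  "version": "1.0.0",
  "peerDependencies": {
    "example-host-lib": "^2.0.0"
  },
  "peerDependenciesMeta": {
    "example-host-lib": {
      "optional": true
    }
  }
}
```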
Bugfixes will continue, but anything that requires significant behavior or structural change will probably be put off until the next major. npm v7 The headline theme for v7 is a refactor to the installer that adds consistency and speed, and makes a lot of desirable features better and easier to implement. Installer Refactor Most of the installer logic is currently being factored out into @npmcli/arborist. This will enable teasing apart a lot of logic that has been developed organically over the years. The installer in the npm CLI itself will just handle passing arguments to @npmcli/arborist and listening to events to update the progress bar. This will bring a lot of consistency to the various sorts of installation tasks (prune, update, etc.) as well as making package-lock.json more deterministically generated. It incorporates and replaces read-package-tree, read-logical-tree, libcipm, and almost all of the code in the CLI’s lib/install/ folder. npm link The Arborist module is being designed with first-class support for symbolic links. This will fix a class of edge cases where a symlinked folder in node_modules results in strange behavior and incorrect “missing dependency” warnings. Links will be linked directly to their targets, instead of symlinking globally and then linking the global link into the local node_modules. npm ls <path> Currently, npm ls can take a package name, and show all the packages that depend on that package name. In v7, npm ls /logical/path or npm ls node_modules/file/node_modules/path will be able to show you all of the packages whose dependency is met by that specific instance of a package, more clearly answering the “why is this here?” question, without resorting to digging through npm’s metadata manually. Workspaces First-class support for symbolic links means that workspaces will be trivial to implement. npm v7 will have at least the Workspace feature support of Yarn, and will set the stage for more advanced workspace features in v8. 
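Yarn's existing workspaces configuration, which the post says npm v7 will at least match, is driven by a field in the root package.json. A sketch (the folder layout is illustrative):

```json
{
  "name": "monorepo-root",
  "private": true,
  "workspaces": [
    "packages/*"
  ]
}
```

Each folder matching `packages/*` is treated as its own package and symlinked into the root `node_modules`, which is why Arborist's first-class symlink support makes the feature straightforward to implement.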
At minimum, you’ll be able to keep multiple related packages all together in a single repository, and test changes in an integrated environment, without continually re-linking. Once Workspaces land, it’ll be possible to add more advanced workspace management features. For example, building, versioning, managing permissions, and publishing all the packages within a workspace together with a single command. That likely won’t be in 7.0, and may be pushed out until v8, depending on feedback and user demand. Proper peerDependencies Support Part of the installer rewrite involves taking a fresh look at how peerDependencies are handled (and, quite often, not handled) to maximize the cases where the CLI does the right thing. Since npm v3, peerDependencies were not installed by default, putting the onus on users to resolve them. We’ll be bringing back this functionality in version 7. This is one of the trickiest (not merely typing-intensive) parts of the refactor, because peerDependencies can actually conflict with one another in ways that create dependency hell. See npm/rfcs#43 for a discussion of how this will be implemented. Resolution Overrides Let’s say your project depends on foo, which depends on bar@1.0.0. There are sometimes cases where you need to give it a different version of bar than the one that foo requests in its package.json. For example, this can be useful to float security or bug fix patches, or to more aggressively deduplicate for the benefit of browser or serverless environments where memory is at a premium. Resolution overrides give you a new tool to get out of these tricky scenarios without having to resort to manually mucking about in node_modules. Play Nicer with Yarn Right now, teams that switch between npm and Yarn (or use both!) have a lot of strife. npm reads and creates package-lock.json. Yarn reads and creates yarn.lock. These files easily get out of sync, and users get scolded by the tools that are supposed to be helping them. 
That’s not The npm Way. npm v7 will treat yarn.lock as a valid source of information. If it sees a yarn.lock file, it’ll keep it up to date when modifying dependencies. If it sees a yarn.lock and no package-lock.json, it’ll use that as the authoritative source of dependency resolution. Also, Arborist will do its best to read the contents of a node_modules folder, even if it wasn’t the one to put it there. This means that npm v7 will not add extra metadata to installed package.json files. Since we won’t depend on it, there’s no need for it. More Intuitive Handling of files Array in package.json Traditionally, npm has treated the files list in package.json as a sort of “un-ignore” list. That is, it gets logically treated as if it was a .npmignore file with * at the top, and ! prepended to all of the items. This results in some odd edge cases, like including .DS_Store files if they are in an included folder, and has made it harder to strip out unwanted files like core dumps without having adverse side effects. In version 7, npm-packlist will be changed to treat "files" specially, so that they can more comfortably coexist with ignore rules (including default-ignored files like .DS_Store and core dumps). Drop Support for Node.js v6 Node.js v6 reached its end of life on 2019-04-30, a few months ago now. Dropping support for Node 6 means that the CLI can make use of more advanced features, up to date modules, and keep up with the usage patterns of our community. Other v7 Things There are some other nice-to-have things that might make it into this release. Do away with the unsafe-perm and uid-switching issue entirely, as it causes more and worse problems than it solves. If someone’s running as root, just go ahead and run scripts as root. Revisit the npm org and npm team commands to improve their usability and reliability. Timing Version 7 doesn’t have a release date yet, and most of the finer points here are still subject to change. 
The installer refactor is underway, and will be the big breaking change that requires a SemVer major bump. If it gets to a stable releasable state before Workspaces and other features land, we will likely decide to release multiple iterations rather than try to squeeze everything into the 7.0.0 release. Initial release will almost certainly be this year, possibly as early as this fall. Beyond Arborist, to Tink: npm v8 Version 8 will primarily be about integrating Tink. See Kat’s talk about Tink for more information about the whys and whats of that change. The Arborist refactor will provide a much more well-understood and stable footing for Tink to be merged in. The tl;dr on Tink is: Dependencies don’t actually get unpacked in node_modules unless they have an install script, or an unwind is specifically requested. Instead, they are just added to the cache, and a package-lock.json or npm-shrinkwrap.json is generated. A fallback-fs overrides the Node.js fs module whenever a script is run via npm. This allows node programs to be served the files out of the npm cache for the specific dependency it would resolve to, instead of out of node_modules, without modifying anything else further up the stack. Of course, it won’t work for cases where you want to run node server.js, but a new npm sh command will support running arbitrary commands as if they were an npm script. Tink is still very experimental, and it’s 2 major releases out, so expect a lot of this to change in the finer details. Source: The npm Blog — npm CLI Roadmap – Summer 2019

    Read at 02:42 pm, Aug 15th

  • Proposal to Auto-Update Old Versions of WordPress to 4.7 Sparks Heated Debate – WordPress Tavern

    WordPress contributors, developers, and community members are currently debating a proposal that would implement a new policy regarding security support for older versions. The discussion began last week when security team lead Jake Spurlock asked for feedback on different approaches to backporting security fixes to older versions. Following up on this discussion, Ian Dunn, a full-time contributor to WordPress core, sponsored by Automattic, has published a proposal for moving forward with a new policy: Support the latest 6 versions, and auto-update unsupported sites to the oldest supported version. That would mean that the currently supported versions would be 4.7 – 5.2, and the 3.7 – 4.6 branches would eventually be auto-updated to 4.7. In practice, that’d provide roughly 2 years of support for each branch, and roughly 10% of current sites would eventually be auto-updated to 4.7. Once 5.3 is released, the oldest supported version would become 4.8. Dunn outlined a detailed plan for implementing the new policy that involves testing a small subset of sites to identify problems before gradually updating older sites from one major version to the next (not all at once). Site administrators would be notified at least 30 days prior to the automatic updates with emails and notices in the admin that would also offer the opportunity to opt out. The proposal has received dozens of comments, with some contributors in support, some in favor of modifications to the rollout, and others who are unequivocally opposed to the idea of auto-updating old sites to major versions. One of the prevailing concerns is that many admins will not receive any notice due to non-functioning email addresses or not logging into their admin dashboards frequently enough. Opponents also contend that even though there are fallbacks for sites that fail to upgrade, some sites may be broken in a way that WordPress cannot detect, due to problems with plugins or themes. 
“A back-end notice will not even begin to make up for the lack of reliable email communication,” Glenn Messersmith said. “There are tons of site owners who never venture into the back-end once their site has been developed. These are the very people who will not get email notifications either because the email address is that of some long gone developer. “There is no way any sort of error detection can act as a safety net for those who never saw any notifications. There are all sorts of ways that a site owner might consider their site to be ‘broken’ which an update script could not possibly detect.” In response to concerns about abandoned sites breaking or administrators relying heavily on a plugin that has been abandoned, Dunn agreed that these types of situations may be unavoidable under the current proposal. “I can definitely sympathize with that situation, but we have to draw the line somewhere,” Dunn said. “We don’t have unlimited resources, and the current policy has damaging effects for the entire WordPress ecosystem. “In reality, choices are never between a purely good thing and a purely bad thing; they’re always between competing tradeoffs. “I definitely agree that it’s bad if a small number of site owners have to do extra work to upgrade their site, but in the grand scheme of things, that’s much, much better than having our security team be hindered by an extremely onerous support policy.”

Proposal Author Claims “Nobody Would be Forced to Update;” Opponents Argue that Requiring Users to Opt Out is Not Consent

In addition to the problem of possibly breaking sites, those opposed to the proposal are not on board with WordPress forcing an update without getting explicit consent from site administrators. Providing users a way to opt into automatic updates for major core releases is one of the nine projects that Matt Mullenweg had identified for working on in 2019. 
However, the plan for this proposal is more aggressive in that it would require site owners on the 3.7 – 4.6 branches to opt out if they do not want to be incrementally auto-updated to 4.7. “They still retain agency no matter what, nobody would be forced to update, everybody retains control over their site and can opt-out if they want to,” Dunn said. “Something being on by default is very different from forcing somebody to do something. We would make it very easy to opt out — just install a plugin, no config required — and the instructions for opting out would be included in every email and admin notice.” Dunn further clarified in a comment regarding who would receive these updates: Nobody would be forced, it would instead be an opt-out process. If someone has already disabled auto-updates to major versions, that would be respected and their site would not be updated. If someone clicked the opt-out link in the email, or if they clicked the opt-out button in the admin notice, then the updates would also be disabled. The only people who would receive the updates are the ones who: 1) Want the update 2) Don’t care 3) Have abandoned their sites or email accounts Several participants in the discussion asked why the process of getting these sites on 4.7 cannot be opt-in for consent, instead of forcing the update on those who don’t opt out. No matter how convenient the opt-out mechanism is, having one in place doesn’t constitute consent. Many site owners who will be forced into this process thought they would be safe in opting for maintenance and security updates and leaving their sites to perform “updates while you sleep,” as the 3.7 release post described the feature. “Insecure sites are bad, but arguably, retrospectively enlarging the power granted to oneself by this mechanism is worse,” UpdraftPlus creator David Anderson said. “Potentially it could damage trust + reputation more than insecurity. 
I’d argue that huge, ugly, irremovable dashboard notices on older versions warning of upcoming abandonment + the need to update would be better. Let the site owner take responsibility. Don’t play nanny, abuse trust, break sites and then write blog posts about how it was necessary collateral damage. Nobody who wakes up to a broken site will be happy with that.” Andrew Nacin, WordPress 3.7 release lead and co-author of WordPress’ automatic background updates feature, encouraged those behind the proposal to clarify that WordPress only supports the latest major version and has never officially supported older versions. “It takes a lot of work, for sure, to backport,” Nacin said. “But we should still stick to our north star, which is that WordPress is backwards compatible from version to version, that WordPress users shouldn’t need to worry about what version they are running, and that we should just keep sites up to date if we are able.” Nacin offered more context on the original strategy for introducing automatic updates, which included gradually moving to having major releases as auto updates so all sites would eventually be on the latest version: First, when we first released automatic background updates, we thought that our next big push would be to get to major release auto updates in the next few years. In practice, we can do this at any time, and, indeed, 3.7 supported this as a flag. But the idea was we would invest energy in sandboxing, whitescreen protection, improving our rollback functionality, etc., so our success rate was as high for major versions as it was for minor versions. (The failure rate scales somewhat linearly with the number of files that need to be copied over, and also gets more complex when files need to be added, rather than just changed.) Once we did this, we’d simply start updating all sites to the latest version and stop backporting. Obviously we still haven’t gotten here. 
He commented that overall the proposal is “a great plan” but emphasized the benefits of communicating to users that it is safe to update and that WordPress only intends to support the latest version. Most participants in the discussion are in favor of the security team discontinuing backporting fixes to older versions of WordPress. The question that remains unanswered for opponents is why it is WordPress’ responsibility to force older sites to update. “I don’t think it should be WordPress’ decision to update sites that they don’t manage to major/breaking versions, but I think maintaining those branches should be stopped,” Will Stocks said. “You (WordPress) don’t own the infrastructure or business processes, or understand the support in place to manage those sites. There is also a reason those sites are still on that version today and have not upgraded past.” There are other approaches that can still draw a line to respect the security team’s limited resources without forcing any non-consensual updates to major versions. Rachel Cherry, director of WPCampus, commented on the proposal, strongly urging WordPress to establish consent before updating these sites: We are getting into the weeds of whether or not forced updates will cause tech issues and missing the real problem altogether. We are discussing force updating people’s software when they have not given consent. And for what end? What is the real problem here? Because we don’t want to worry about updating old versions? There are other ways to solve this problem. We can make a clear policy regarding EOL support for releases. We can add a setting to core that lets the user choose whether or not they want auto updates and going forward that is the decision maker. Then we have consent. We can work on education and communication regarding updates. We can email people that their site is outdated and insecure and they should update ASAP, along with links to education and best practices. 
If they still need help, encourage them to reach out to a professional. We can fix this problem for going forward, but we do not have implied retroactive consent just because we never put a permission mechanism in place. If someone didn’t update their site, they did so for a reason. Or indifference. Either way, we have no right to go in like this and modify people’s websites. Participants in the discussion are still wrestling with the potential implications of the proposed policy change. Minor updates have proven to be very reliable as auto-updates. Dunn reported that the 3.7.29 auto-update had only one failure that had to be rolled back to 3.7.28. Using the auto update system to push major updates to sites as old as these has not yet been thoroughly tested. “Whether or not we do auto-update the 3.7 -> 5.x releases, I fully support making it clear that this is something we expect to start doing for the future (5.x -> x.x+),” Jeremy Felt commented on the proposal. “The work on testing infrastructure and code to support this should absolutely be done either way.” Felt also said he appreciated the staggered rollout scheduling for the proposed releases as well as the plan to provide an officially supported plugin for disabling auto-updates. Discussion is still open on the proposal, but so far there seems to be a fundamental disagreement among participants about whether WordPress has the right to force major version updates without explicit consent, even if it is with the intention of saving site owners from potentially getting hacked. “One thing is for sure, it appears to be a majority concern so far, while many of us are fond of these noble intentions, I’m just not so sure being the benevolent overlord of the Internet is a good image for WP moving forward,” plugin developer Philip Ingram said. Source: Proposal to Auto-Update Old Versions of WordPress to 4.7 Sparks Heated Debate – WordPress Tavern

    Read at 01:34 am, Aug 15th

  • React v16.9.0 and the Roadmap Update – React Blog

    React v16.9.0 and the Roadmap Update

Today we are releasing React 16.9. It contains several new features, bugfixes, and new deprecation warnings to help prepare for a future major release.

New Deprecations

Renaming Unsafe Lifecycle Methods

Over a year ago, we announced that unsafe lifecycle methods are getting renamed:

componentWillMount → UNSAFE_componentWillMount
componentWillReceiveProps → UNSAFE_componentWillReceiveProps
componentWillUpdate → UNSAFE_componentWillUpdate

React 16.9 does not contain breaking changes, and the old names continue to work in this release, but you will now see a warning when using any of the old names. As the warning suggests, there are usually better approaches for each of the unsafe methods. However, maybe you don’t have the time to migrate or test these components. In that case, we recommend running a “codemod” script that renames them automatically:

cd your_project
npx react-codemod rename-unsafe-lifecycles

(Note that it says npx, not npm. npx is a utility that comes bundled with npm 5.2+ and Node 8+.) Running this codemod will replace the old names like componentWillMount with the new names like UNSAFE_componentWillMount. The new names will keep working in both React 16.9 and React 17.x. However, the new UNSAFE_ prefix will help components with problematic patterns stand out during code review and debugging sessions. (If you’d like, you can further discourage their use inside your app with the opt-in Strict Mode.) Note: learn more about our versioning policy and commitment to stability.

Deprecating javascript: URLs

URLs starting with javascript: are a dangerous attack surface because it’s easy to accidentally include unsanitized output in a tag like <a href> and create a security hole. In React 16.9, this pattern continues to work, but it will log a warning. If you use javascript: URLs for logic, try to use React event handlers instead. 
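What the codemod does can be sketched as a simple name mapping. This is illustrative only: the real react-codemod rewrites the AST rather than doing textual replacement, and the function name here is our own.

```javascript
// Illustrative sketch of the rename-unsafe-lifecycles rewrite.
const LIFECYCLE_RENAMES = {
  componentWillMount: "UNSAFE_componentWillMount",
  componentWillReceiveProps: "UNSAFE_componentWillReceiveProps",
  componentWillUpdate: "UNSAFE_componentWillUpdate",
};

function renameUnsafeLifecycles(source) {
  // \b keeps already-prefixed UNSAFE_ names from being renamed twice,
  // since the underscore counts as a word character
  return source.replace(
    /\b(componentWillMount|componentWillReceiveProps|componentWillUpdate)\b/g,
    (name) => LIFECYCLE_RENAMES[name]
  );
}
```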
(As a last resort, you can circumvent the protection with dangerouslySetInnerHTML, but it is highly discouraged and often leads to security holes.) In a future major release, React will throw an error if it encounters a javascript: URL. Deprecating “Factory” Components Before compiling JavaScript classes with Babel became popular, React had support for a “factory” component that returns an object with a render method: This pattern is confusing because it looks too much like a function component — but it isn’t one. (A function component would just return the <div /> in the above example.) This pattern was almost never used in the wild, and supporting it causes React to be slightly larger and slower than necessary. So we are deprecating this pattern in 16.9 and logging a warning if it’s encountered. If you rely on it, adding FactoryComponent.prototype = React.Component.prototype can serve as a workaround. Alternatively, you can convert it to either a class or a function component. We don’t expect most codebases to be affected by this. New Features Async act() for Testing React 16.8 introduced a new testing utility called act() to help you write tests that better match the browser behavior. For example, multiple state updates inside a single act() get batched. This matches how React already works when handling real browser events, and helps prepare your components for the future in which React will batch updates more often. However, in 16.8 act() only supported synchronous functions. Sometimes, you might have seen a warning like this in a test but could not easily fix it: An update to SomeComponent inside a test was not wrapped in act(...). In React 16.9, act() also accepts asynchronous functions, and you can await its call: This solves the remaining cases where you couldn’t use act() before, such as when the state update was inside an asynchronous function. As a result, you should be able to fix all the remaining act() warnings in your tests now. 
We’ve heard there wasn’t enough information about how to write tests with act(). The new Testing Recipes guide describes common scenarios, and how act() can help you write good tests. These examples use vanilla DOM APIs, but you can also use React Testing Library to reduce the boilerplate code. Many of its methods already use act() internally. Please let us know on the issue tracker if you bump into any other scenarios where act() doesn’t work well for you, and we’ll try to help. Performance Measurements with <React.Profiler> In React 16.5, we introduced a new React Profiler for DevTools that helps find performance bottlenecks in your application. In React 16.9, we are also adding a programmatic way to gather measurements called <React.Profiler>. We expect that most smaller apps won’t use it, but it can be handy to track performance regressions over time in larger apps. The <Profiler> measures how often a React application renders and what the “cost” of rendering is. Its purpose is to help identify parts of an application that are slow and may benefit from optimizations such as memoization. A <Profiler> can be added anywhere in a React tree to measure the cost of rendering that part of the tree. It requires two props: an id (string) and an onRender callback (function) which React calls any time a component within the tree “commits” an update. To learn more about the Profiler and the parameters passed to the onRender callback, check out the Profiler docs. Note: Profiling adds some additional overhead, so it is disabled in the production build. To opt into production profiling, React provides a special production build with profiling enabled. Read more about how to use this build at fb.me/react-profiling. Notable Bugfixes This release contains a few other notable improvements: A crash when calling findDOMNode() inside a <Suspense> tree has been fixed. A memory leak caused by retaining deleted subtrees has been fixed too. 
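A minimal sketch of an onRender callback for the Profiler: the parameter order follows the Profiler docs, while the logging scheme and the hand-simulated call are our own illustration (outside a React tree, nothing would invoke the callback for us).

```javascript
// Collect per-commit render timings, as <React.Profiler> would report them.
const commitLog = [];

function onRenderCallback(
  id,             // the "id" prop of the Profiler tree that committed
  phase,          // "mount" or "update"
  actualDuration, // time spent rendering the committed update
  baseDuration,   // estimated render time for the subtree without memoization
  startTime,      // when React began rendering this update
  commitTime,     // when React committed this update
  interactions    // the Set of interactions belonging to this update
) {
  // record just enough to track performance regressions over time
  commitLog.push({ id, phase, actualDuration });
}

// In JSX this would be wired up as:
//   <React.Profiler id="Navigation" onRender={onRenderCallback}> ... </React.Profiler>
// Simulating one commit by hand for illustration:
onRenderCallback("Navigation", "mount", 3.5, 5.0, 0, 4.2, new Set());
```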
An infinite loop caused by setState in useEffect now logs an error. (This is similar to the error you see when you call setState in componentDidUpdate in a class.) We’re thankful to all the contributors who helped surface and fix these and other issues. You can find the full changelog below.

An Update to the Roadmap

In November 2018, we posted this roadmap for the 16.x releases:

• A minor 16.x release with React Hooks (past estimate: Q1 2019)
• A minor 16.x release with Concurrent Mode (past estimate: Q2 2019)
• A minor 16.x release with Suspense for Data Fetching (past estimate: mid 2019)

These estimates were too optimistic, and we’ve needed to adjust them. tl;dr: We shipped Hooks on time, but we’re regrouping Concurrent Mode and Suspense for Data Fetching into a single release that we intend to release later this year. In February, we shipped a stable 16.8 release including React Hooks, with React Native support coming a month later. However, we underestimated the follow-up work for this release, including the lint rules, developer tools, examples, and more documentation. This shifted the timeline by a few months. Now that React Hooks are rolled out, the work on Concurrent Mode and Suspense for Data Fetching is in full swing. The new Facebook website that’s currently in active development is built on top of these features. Testing them with real code helped discover and address many issues before they could affect open source users. Some of these fixes involved an internal redesign of these features, which has also caused the timeline to slip. With this new understanding, here’s what we plan to do next.

One Release Instead of Two

Concurrent Mode and Suspense power the new Facebook website that’s in active development, so we are confident that they’re close to a stable state technically. We also now better understand the concrete steps before they are ready for open source adoption. 
Originally we thought we would split Concurrent Mode and Suspense for Data Fetching into two releases. We’ve found that this sequencing is confusing to explain because these features are more related than we thought at first. So we plan to release support for both Concurrent Mode and Suspense for Data Fetching in a single combined release instead. We don’t want to overpromise the release date again. Given that we rely on both of them in production code, we expect to provide a 16.x release with opt-in support for them this year. An Update on Data Fetching While React is not opinionated about how you fetch data, the first release of Suspense for Data Fetching will likely focus on integrating with opinionated data fetching libraries. For example, at Facebook we are using upcoming Relay APIs that integrate with Suspense. We will document how other opinionated libraries like Apollo can support a similar integration. In the first release, we don’t intend to focus on the ad-hoc “fire an HTTP request” solution we used in earlier demos (also known as “React Cache”). However, we expect that both we and the React community will be exploring that space in the months after the initial release. An Update on Server Rendering We have started the work on the new Suspense-capable server renderer, but we don’t expect it to be ready for the initial release of Concurrent Mode. This release will, however, provide a temporary solution that lets the existing server renderer emit HTML for Suspense fallbacks immediately, and then render their real content on the client. This is the solution we are currently using at Facebook ourselves until the streaming renderer is ready. Why Is It Taking So Long? We’ve shipped the individual pieces leading up to Concurrent Mode as they became stable, including new context API, lazy loading with Suspense, and Hooks. We are also eager to release the other missing parts, but trying them at scale is an important part of the process. 
The honest answer is that it just took more work than we expected when we started. As always, we appreciate your questions and feedback on Twitter and in our issue tracker. Installation React React v16.9.0 is available on the npm registry. To install React 16 with Yarn, run: yarn add react@^16.9.0 react-dom@^16.9.0 To install React 16 with npm, run: npm install --save react@^16.9.0 react-dom@^16.9.0 We also provide UMD builds of React via a CDN: <script crossorigin src="https://unpkg.com/react@16/umd/react.production.min.js"></script> <script crossorigin src="https://unpkg.com/react-dom@16/umd/react-dom.production.min.js"></script> Refer to the documentation for detailed installation instructions. Changelog React Add <React.Profiler> API for gathering performance measurements programmatically. (@bvaughn in #15172) Remove unstable_ConcurrentMode in favor of unstable_createRoot. (@acdlite in #15532) React DOM React DOM Server Fix incorrect output for camelCase custom CSS property names. (@bedakb in #16167) React Test Utilities and Test Renderer Source: React v16.9.0 and the Roadmap Update – React Blog

    Read at 12:55 am, Aug 15th

  • NPM employees worry about company future amid layoffs, resignations - Business Insider

    If you're reading this article, there's a solid chance you're enjoying the benefits of NPM without even knowing it. NPM provides a free, popular tool for coding in JavaScript, one of the world's most popular programming languages. With 11 million users, its free package registry powers much of the software that runs the internet and helps developers at companies like Uber and Spotify be more productive. But as happens so often in Silicon Valley, the usefulness and popularity of NPM's free service hasn't directly translated into blockbuster business success. The Oakland, California-based startup recently launched NPM Enterprise, a paid tool for businesses. But it's still largely reliant on the roughly $19 million in funding it's raised from firms such as Bessemer Venture Partners and True Ventures to keep the lights on for its 50 or so employees. About a year ago, NPM hired a new CEO, Bryan Bogensberger, an industry veteran who as CEO helped sell Inktank, an open-source-software business, to Red Hat for $175 million. While Bogensberger started in July 2018, his arrival wasn't officially announced until January. "Bringing Bryan on as CEO and the changes to the leadership of the company over the last year or so have really been reflective of transitioning the company away from a mode of reliance on venture capital and getting to where we are to be self-sustained," NPM cofounder and previous CEO Isaac Schlueter told Business Insider. Under Bogensberger's short tenure, NPM has found itself in the spotlight. It was criticized for its handling of the sudden layoffs of five employees, representing about 10% of its staff, and then faced accusations from some of the laid-off employees that they were let go in retaliation for trying to unionize. Other employees signed a letter demanding better working conditions. And it has seen a string of high-profile resignations, including cofounder Laurie Voss'. 
In June, Bogensberger sent a message to employees saying NPM obtained funding from its board, allowing it to raise venture capital "without the threat of running out of money and with the full support of the board." Read more: Employees at NPM, a startup that provides a crucial service for 11 million software developers, have signed an open letter demanding better working conditions We spoke with 11 former NPM employees, all of whom left the company recently, who told us that behind the scenes, the company has been facing a crisis of morale, even as it goes through this critical phase of fundraising. NPM once prided itself on its motto of "compassion is our strategy" and joked that its name stood for "Nice People Matter," the employees said. Now the company is accused of creating an "atmosphere of fear" for employees amid a larger "pivot to greed." The former employees said management ignored their concerns, even as they were increasingly expected to work longer hours and navigate huge changes to their culture. Employees have also called for more transparency at the company. In the same open letter in which employees called for better working conditions, they lambasted management for not being more open about NPM's financial health. "The company won't trade ad hominem insults with anonymous critics who have an agenda based on specious claims and bitter emotions. Their criticism of Bryan Bogensberger is based on opinions, not facts, and is completely contradicted by what Bryan has accomplished over the course of his career," an NPM spokesperson said. At the heart of the culture clash at NPM may be a challenge that's as old as Silicon Valley itself. NPM was founded by technologists who wanted to build something that could change the world but eventually ran into cold, hard business reality. On one hand, these former employees worry that the new management is jettisoning everything that made them want to work there. 
On the other, the company's leadership has said that the NPM of old wasn't a sustainable business. If you can't generate revenue, your startup won't survive. Bogensberger said in a statement: An open source project needs more than passion to thrive. The sustainability of a free service like the npm registry requires people and infrastructure, and both are expensive. We define sustainability as the ability to build a viable business through proactive, value-adding community engagement, the development of mutually beneficial partnerships and the sale of commercial products that meet the needs of developers at startups and large corporations alike. You can read Bogensberger's full statement below the story. Business Insider also spoke with Schlueter and his cofounder Voss, who recently departed the company. The new boss NPM started its life as a collection of open-source-software evangelists and idealists working from the company's headquarters in downtown Oakland — right in the center of the city's growing tech scene, and with a view of Lake Merritt. However, that changed in July 2018, when Schlueter, the cofounder, stepped aside as CEO to make way for Bogensberger, with the goal of helping the company figure out how to grow a business on top of its open-source foundations. Bogensberger then stocked the leadership team with people he had worked with at Inktank: Ross Turk became senior vice president of marketing and community, though he has since left, while Nigel Thomas became vice president of sales, and Danielle Womboldt became director of field marketing. Former employees told Business Insider it was clear from the get-go that Bogensberger was out of sync with much of the workforce. To several NPM employees, he was representative of the traditional Silicon Valley culture they had pushed against. 
Employees said that in its earliest days, NPM's motto of "compassion is our strategy" was taken very seriously, and the company had tried to make its culture more inclusive and thoughtful than at Silicon Valley tech giants like Facebook or Google. But former employees said they didn't enjoy working with Bogensberger, who they describe as "hot-headed," "anger-driven," "impulsive," and "a bully." Five former employees said Bogensberger was known for yelling during meetings, and two people said Bogensberger was known to "test" employees, meaning that they were given additional responsibilities with "unrealistic expectations" and told by management that their jobs depended on meeting those expectations. The latter concern was echoed in a letter sent to a manager several months into Bogensberger's tenure as CEO. (NPM declined to comment on the question of yelling during meetings and didn't directly address the question of "testing.") From its founding, NPM had prided itself on enforcing strict boundaries around work-life balance. But Bogensberger soon set aggressive product-release timelines that required some to work nights and weekends — something previously taboo — which contributed to an overall feeling of being overworked, according to five former employees. The New York Times recently reported, too, that Bogensberger used an off-site meeting in Napa, California, to present "slides warning employees not to be dramatic," which it said was taken by employees as a rebuke to their concerns about the pace at the company. CJ Silverio, NPM's former chief technology officer, told Business Insider that she clashed with Bogensberger after he pushed her teams to "crunch" and put in even more hours. "He was attacking the core of how I run engineering teams, my basic values of what I think is important, and what a sustainable management strategy looks like," Silverio said. 
Several former employees also said Bogensberger's arrival heralded a shift in NPM's corporate culture: Where before, the company had consciously tried to keep team-building events from centering around alcohol, former employees said Bogensberger frequently took employees out drinking and that alcohol started playing a larger role in NPM's corporate culture. The decision to host an off-site meeting and social events in California's Napa Valley wine country, a common locale for startup team-building events, disquieted some employees, for example. "It's not an issue, and it's not a fundamental part of our culture, and it never has been," Schlueter said on the subject of alcohol at the company. An NPM spokesperson said employees were not pressured to drink and that "no event is 'centered' around alcohol." Even Bogensberger's sartorial choices made him stand out. One former employee said NPM thought of itself as serving the "pencil necked and purple haired and queer open-source community." Bogensberger in contrast typically dressed in a polo shirt, jeans, and the NPM-branded Converse sneakers the company makes for employees, the former employees said. It also didn't help matters that Bogensberger took over a conference room — one of the few with a clear view of Lake Merritt — to be his private office. A spokesperson for NPM said Bogensberger didn't choose this room and that it was assigned to him before he arrived. The big hurry The big hurry was to launch NPM Enterprise, a paid product for businesses and Bogensberger's big bet on the future of the company. "Our revenue is what can fund our core operations," Schlueter said. "That has required doing a lot of things in a grown-up, more business-y way." And NPM Enterprise was the flagship of this strategy. Concerned employees went so far as to write a letter, sent to management, saying employees were being overworked and facing "unrealistic" deadlines with little guidance. 
It urged that the schedule for NPM Enterprise be pushed back because of concerns about the product's readiness and security: "We have set a product launch deadline that makes it necessary to release a product that is not up to our standards, which puts us at high risk of compromising our market position," the letter said. The product ultimately launched in February, in line with Bogensberger's schedule. "The pressure leading up to the launch was so at odds with the culture that attracted me to the company and other people to the company," a former employee said. "It's completely contradicted to the handbook and the statements our founders had made." "[NPM said it] didn't have a culture of burnout and being crappy to the support team. We were presenting ourselves as not being that, but we were in fact being that," the former employee said. An NPM spokesperson said "no employee is asked nor expected to work nights and weekends." The spokesperson said the former employees who felt otherwise "seem to confuse working hard towards a shared purpose of supporting the community and becoming customer-centric with 'intensity.'" Silverio said another sticking point was jokes made by Bogensberger about shutting down the public registry, NPM's original and most popular service, a free resource that many consider crucial. Several NPM employees said they originally came to the company just to work on it. Silverio was dismissed by NPM in November. The NPM spokesperson said any comments made by Bogensberger were in the spirit of trying to understand why the service is so crucial and that he recognizes "the public registry is the heart and soul of the company."

'A deep atmosphere of fear'

Things came to a head in March, when NPM dismissed five employees, citing a reorganization. The company issued an apology after it was widely criticized for its handling of those layoffs and characterized them as part of the growing pains of making the company more sustainable.
Whatever the reason, the layoffs sowed discord and discontent among employees, sources said. They estimated that two or three people resigned from the 50-person company every week for a little while after. "Now there's definitely a deep atmosphere of fear," a former employee said. The motivation for the layoffs wasn't always clear, former employees said: The company had said the layoffs were because of a reorganization, but it put up a job posting with a job description similar to that of one of the affected employees immediately after the layoffs, they said. NPM said that "the previous engineer was laid off because they could not satisfy the requirements demanded by the company's strategic direction." Even the messaging app Slack, which had once served as the social hub for the company, came under suspicion: Sources told Business Insider that NPM executives requested private chat logs between employees. (The NPM spokesperson said this came "as part of a legal matter on the advice and counsel of our attorney.") The former employees said another demoralizing episode came when the company's employee handbook, which contained its policies — one more relic of NPM's original culture — was deleted from the company intranet. This was concerning, sources said, because the handbook laid out procedures for parental leave, vacation, and reporting sexual harassment. Without it, sources said, there's little guidance on how to handle those and similar situations at NPM. Three of those employees said executives promised to replace it but that they're still waiting. NPM said making a new one was a high priority.

The Wombat Developer Union

Others have a more sinister explanation for those layoffs. Four of the five dismissed employees were involved in unionization efforts, sources said. Before the leadership transition, employees openly talked about unionizing, a former employee said.
They called this concept the "Wombat Developer Union," or "wdu," because the initials look like "npm" upside down. They even made laptop stickers. "We would joke about it openly," the employee said. "It wasn't like we were afraid." Things got more serious earlier this year, when NPM's workforce discussed actually starting a union, a former employee said. Employees faced increasingly long hours, often stretching through nights and weekends, all to meet what they saw as arbitrary deadlines. For example, a designer had to work nights and weekends to meet unrealistic deadlines, but was fired shortly after the launch, two sources said. Five people filed complaints with the National Labor Relations Board (NLRB), and NPM settled with three of them. Initially, NPM offered only a dollar for settlement, in addition to back pay, according to a source familiar with the matter. An NPM spokesperson disputed this and said it ultimately agreed to a settlement in which each employee received "back pay, interest on the back pay, reimbursement of some small expenses and the severance benefits we offered them at the time of their dismissal." Laurie Voss, NPM's cofounder and chief data officer, who describes himself as a socialist, said he was unaware of unionization efforts. "While I was at NPM, no one ever came to me and proposed to unionize," Voss told Business Insider. "If they had proposed to unionize, I would be in favor. It was really a huge surprise for me that we received that NLRB complaint." Likewise, Schlueter said: "People who are saying I'm a union buster don't know me very well."

NPM's future

With all that's transpired, sources said morale at NPM has shown little sign of improvement. In May, Schlueter cut an all-hands meeting short after employees spent time lavishing praise on coworkers who were preparing to leave NPM, according to two people present.
In June, someone anonymously sent the book "Corporate Finance for Dummies" to Chief Operating Officer Dawn Umlah, in an apparent statement on the company's handling of the business. NPM said Umlah donated the book to charity. What's more, GitHub, which is owned by Microsoft and backed by its capital and technology, recently launched a direct competitor to NPM. Still, Schlueter says NPM is working to find ways to address concerns about employee burnout. In May, NPM was restructured to have engineering, security, product, and support all report to Chief Technology Officer Ahmad Nassri, in a play to free up resources and streamline processes. "Burnout doesn't just happen from working too much," Schlueter said. "Burnout happens in ways that feel unproductive. We have worked to take a look at where we didn't do as well as we hoped. Moving on from that is identifying the communication structures between product and support and engineering." Two sources also said they felt NPM did little to promote or invest in people of color, as it was rare to see them in management positions. The company said it has made progress on this front, and that a year ago, NPM had only one woman and no minorities on its executive team, but it has now achieved gender balance across the company and continues "to focus on achieving the same improvements racially, as does most any responsible company." Most of all, former employees expressed concern over what this turmoil means for the future of NPM, the service. If the company failed, it would pose a serious risk to the public registry's future. These stumbles affect more than just NPM's 50 employees, they said. An exodus of talent affects the 11 million developers who use the platform, and then all the people who use the software that those developers build. It could detrimentally affect the whole internet ecosystem. "It's disappointing to me personally," Silverio said.
"I wanted to work for this company because I believed in the mission, and I also came to believe the founders when they said what matters to them. It's a shock to find that [Voss is] actually completely willing to sell out all of his values for money for a shot at making his stock options worthwhile." As for Voss, in July, he announced he was resigning from NPM, where he served as chief data officer. "As a cofounder, having been there for five years, there's all sorts of things that go into that decision," Voss said. "It was a very complicated and difficult decision, and that's all I want to say." Schlueter, who remains with NPM, says that he's committed to the company's original ideals. Whether staying independent or getting acquired, he's not attached to a specific outcome — as long as it sustains the registry. "Whatever I do, I'm going to be weighing it on the benefit of the registry," Schlueter said. "That's why I started this company. It was the best option in order to create a level of sustainability. This is my intrinsic motivation regarding this. Just to lay it out, if I had two exits, if one had more money but the other put the registry in a better place, I would take a smaller payout." Read Bogensberger's full statement: Our long-term mission is to keep the registry running and free to all. Isaac Schlueter, the creator of npm, founded npm, Inc. in 2014 to insure the registry would be self-sustaining and available for free to the millions of users who depend on it over the long term. An open source project needs more than passion to thrive. The sustainability of a free service like the npm registry requires people and infrastructure, and both are expensive. We define sustainability as the ability to build a viable business through proactive, value-adding community engagement, the development of mutually beneficial partnerships and the sale of commercial products that meet the needs of developers at startups and large corporations alike. 
The current team at npm, Inc. is working hard to finally deliver on the company's original mission to become self-sustainable. Like most startups, we are working toward a future where we don't depend on venture capital, but instead generate revenue from commercial customers while continuing to improve the free services such as we've done over the past year; by overcoming major scaling issues with our legacy technology, improving npm Audit availability as well as delivering improvements to the registry's search functionality. We have made great progress on those projects in addition to meeting our commercialization goals, and I am extremely proud of the current npm team that is making it all happen. Source: NPM employees worry about company future amid layoffs, resignations – Business Insider

    Read at 12:47 am, Aug 15th

  • When Good People Go Bad: Mediating Community Conflict

    At the CMX Connect Cape Town meetup last week, we had a great chat about conflict resolution within communities – whether they are online or offline.

    Read at 05:52 pm, Aug 15th

  • Why The Cops Won't Help You When You're Getting Stabbed

Ever wondered what it's like to stop a psycho on a killing spree? It isn't as awesome as you'd think.

    Read at 01:42 pm, Aug 15th

  • ‘Loud, obsessive, tribal’: the radicalisation of remain

    They hate Boris Johnson and Jeremy Corbyn. They no longer trust the BBC. They love civil servants, legal experts and James O’Brien. And now, consumed by the battle against Brexit, hardcore remainers are no longer the moderates. By

    Read at 01:35 pm, Aug 15th

Day of Aug 14th, 2019

  • FDNY Reviewed 4chan Post About Jeffrey Epstein’s Death

    The New York City Fire Department looked into whether an employee posted about Jeffrey Epstein’s death on a notorious internet message board prior to officials announcing it to the public, BuzzFeed News has learned. After telling BuzzFeed News the post was "under review," an FDNY spokesperson said authorities "determined this alleged information did not come from the Fire Department." "An investigation is a formal act which brings about a process which includes interviewing witnesses, serving notice, determining credibility of witness statements — and that was not warranted nor did it take place here. This determination was made after a review of the incident. We looked at the information provided by [a BuzzFeed News] reporter and we looked at our own records and there was no match," said FDNY spokesperson Frank Dwyer, who added that the FDNY's Office of Healthcare Compliance conducted the review. "It doesn't match our medical records." Almost 40 minutes before ABC News first reported Epstein’s death on Twitter, someone posted still-unverified details on 4chan, the anonymous message board popular with far-right trolls and white nationalists. “[D]ont ask me how I know, but Epstein died an hour ago from hanging, cardiac arrest. Screencap this,” read the post, which was published at 8:16 a.m. alongside an image of Pepe, the green frog that has become a mascot for right-wing internet trolls. That message was posted 38 minutes before the first tweet about Epstein’s death from Aaron Katersky, an ABC News reporter, at 8:54 a.m. Five minutes later, the main ABC News account tweeted an article about Epstein's death. After publishing the post, other 4chan users egged on the author. 
When they expressed doubt, the original poster added more information to the discussion thread, including a detailed breakdown of the procedures allegedly used to resuscitate Epstein, which suggest the poster may have been a first responder, medical worker, or otherwise privy to details about efforts to resuscitate the disgraced financier. Dwyer told BuzzFeed News he “could not verify the accuracy” of information in the 4chan post. But he said any medical professional who divulges patient information without consent is in violation of a federal health privacy law, HIPAA, and that FDNY would look into it. “The department will review this incident,” he said at the time. The FDNY later said that it "is reviewing this incident, there is no investigation" — but would not describe the difference between a review and investigation. Oren Barzilay, the president of the union for EMT workers Local 2507 in New York, said, “our members do not release this type of confidential information, this looks like a 3rd party info.” Barzilay also told BuzzFeed News the union would investigate the potential breach of confidentiality “if such a claim came forward.” “There's serious consequences for those violations. Discipline. Suspensions. Civil penalties, etc,” Barzilay said in an email. The medical examiner's office and the Metropolitan Correctional Center both declined to comment. Spokespeople for the New York Presbyterian Hospital have not responded to repeated emails and calls requesting comment. The full details of Epstein’s death won’t be known until the final coroner’s report is released, meaning the 4chan post laying out the alleged treatment he received cannot be fully verified. An EMS expert contacted by BuzzFeed News said the details in the post are consistent with standard practices. Information released by the Federal Bureau of Prisons also appears to line up with some of what was posted on 4chan. Dr. 
Keith Wesley, an emergency medicine physician who has authored several EMS textbooks and articles, viewed the 4chan post at BuzzFeed News’ request and said it lays out standard procedures for paramedics. “This sounds like standard American Heart Association guidelines, which most EMS agencies use,” Wesley said. Part of the post refers to “telemetry advised bicarb.” According to Wesley, this could mean the first responders were also speaking with the hospital as they were trying to resuscitate their patient. “Telemetry implies the paramedics were in contact with a medical control hospital who then gave orders to give Sodium Bicarbonate, bicarb which is designed to reverse the acid buildup in the blood from prolonged cardiac arrest,” Wesley said in an email. “If one of the medics posted this separately that’s a breach of protocol,” he added. “If there was identifying information on the patient, that is a violation of Federal HIPPA law.” The 4chan user made six posts about Epstein’s death. One of them claimed that attempts to resuscitate Epstein were made for 40 minutes before he was transported to the hospital, at which point medical personnel tried to revive him for another 20 minutes. Those details are at least partially consistent with the information regarding Epstein’s death that has been publicly released by the Federal Bureau of Prisons. “Pt transported to Lower Manhattan ER and worked for 20 minutes and called. Hospital administrator was alerted, preparing statements,” said the 4chan post. “Staff requested emergency medical services (EMS) and life-saving efforts continued,” said a statement released by the Department of Justice’s Federal Bureau of Prisons on Saturday. “Mr.
Epstein was transported by EMS to a local hospital for treatment of life-threatening injuries, and subsequently pronounced dead by hospital staff.” It’s not clear who posted the news of Epstein’s death on 4chan, but first responders and hospital staff would have access to information laid out in the post. The reaction of 4chan to the news was explosive. Some users didn’t believe the author while others launched straight into conspiracy theories that have since engulfed news of Epstein’s death. The 4chan thread was first found by Konrad Iturbe, a developer based in Barcelona who was researching conspiracies surrounding Epstein online. “That'll keep the conspiracies forever!” Iturbe said about the 4chan posts. Source: FDNY Reviewed 4chan Post About Jeffrey Epstein’s Death

    Read at 04:00 pm, Aug 14th

  • Democratic Socialists look to take over New York's powerful labor unions

    NEW YORK — A left-leaning political organization that publicly backed Rep. Alexandria Ocasio-Cortez in her insurgent victory last year was also quietly plotting to penetrate another New York City power source — labor unions.

    Read at 10:46 pm, Aug 14th

  • Why Philosophers Shouldn’t Sign Petitions

    Our job is to persuade by argument, not by wielding influence. Ms. Callard is an associate professor of philosophy at the University of Chicago.

    Read at 10:39 pm, Aug 14th

  • DSA Convention 2019: Sanderism and the Tyranny of the Procedural

    This weekend in Atlanta, the Democratic Socialists of America (DSA) will be holding their second national convention since it became the largest socialist organization in the United States in 2016.

    Read at 10:30 pm, Aug 14th


    We stand for active ideological struggle because it is the weapon for ensuring unity within the Party and the revolutionary organizations in the interest of our fight. Every Communist and revolutionary should take up this weapon.

    Read at 02:11 pm, Aug 14th

  • Do America’s Socialists Have a Race Problem?

    Inside a raging debate that has split the country's most exciting new political movement

    Read at 01:48 pm, Aug 14th

  • 3 Kinds of Good Tech Debt — Squarespace / Engineering

    “Tech debt” is a dirty word in the software engineering world. It’s often said with an air of regret; a past mistake that will eventually need to be atoned for with refactoring. Financial debt isn’t universally reviled in the same way.

    Read at 01:31 pm, Aug 14th

  • Jeffrey Epstein’s computers seized during FBI raid on private island

    Drone video shows FBI agents and NYPD cops seizing computer equipment from Jeffrey Epstein‘s mansion on a private island in the Caribbean, according to a report Tuesday. The video was shot Monday during a raid on Little St.

    Read at 01:25 pm, Aug 14th

  • Corrections officers may have falsified reports saying they checked on Jeffrey Epstein

    New York — Corrections officers may have falsified reports saying they checked on Jeffrey Epstein as required by protocol, according to a law enforcement source with knowledge of the investigation into Epstein's apparent suicide.

    Read at 01:25 pm, Aug 14th

  • NYPD chief admits city has poor mental health care in wake of cop suicides

    Hours after the NYPD lost its eighth officer to suicide this year, NYPD Chief of Department Terence Monahan admitted that the city’s health care system makes it tough for cops to seek mental health care.

    Read at 01:23 pm, Aug 14th

  • This Tumblr Post About Extremism From Four Years Ago Goes Viral After Every Shooting. Its Creator Finally Explains It.

    The original poster, Shitpostradamus, said a blog run by a white nationalist brony inspired him to make the chart. There’s an old blog post that circulates online every time another white man radicalized via the internet carries out a terrorist attack.

    Read at 01:22 pm, Aug 14th

  • The FBI Descends On Jeffrey Epstein's 'Pedophile Island'

    Jeffrey Epstein has died, but officials still have quite a bit of work ahead of them; specifically, searching the contents of the private island on which he was said to have trafficked many of his victims. On Monday, the FBI descended on the island Little St.

    Read at 12:04 am, Aug 14th

Day of Aug 13th, 2019

  • All the New ES2019 Tips and Tricks | CSS-Tricks

The ECMAScript standard has been updated yet again with the addition of new features in ES2019. Now officially available in Node, Chrome, Firefox, and Safari, these features can also be compiled with Babel to an earlier version of JavaScript if you need to support an older browser. Let’s look at what’s new!

Object.fromEntries

In ES2017, we were introduced to Object.entries. This was a function that translated an object into its array representation. Something like this:

```javascript
let students = {
  amelia: 20,
  beatrice: 22,
  cece: 20,
  deirdre: 19,
  eloise: 21
}

Object.entries(students)
// [
//   [ 'amelia', 20 ],
//   [ 'beatrice', 22 ],
//   [ 'cece', 20 ],
//   [ 'deirdre', 19 ],
//   [ 'eloise', 21 ]
// ]
```

This was a wonderful addition because it allowed objects to make use of the numerous functions built into the Array prototype: map, filter, reduce, and so on. Unfortunately, it required a somewhat manual process to turn that result back into an object.

```javascript
let students = {
  amelia: 20,
  beatrice: 22,
  cece: 20,
  deirdre: 19,
  eloise: 21
}

// convert to an array in order to make use of .filter()
let overTwentyOne = Object.entries(students).filter(([name, age]) => {
  return age >= 21
})
// [ [ 'beatrice', 22 ], [ 'eloise', 21 ] ]

// turn the multidimensional array back into an object
let DrinkingAgeStudents = {}
for (let [name, age] of overTwentyOne) {
  DrinkingAgeStudents[name] = age
}
// { beatrice: 22, eloise: 21 }
```

Object.fromEntries is designed to remove that loop! It gives you much more concise code that invites you to make use of array prototype methods on objects.
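One detail worth knowing, per the ES2019 specification (it isn't covered by the article's examples): Object.fromEntries accepts any iterable of key-value pairs, not just an array of arrays, so it converts a Map directly into a plain object. A minimal sketch, with made-up sample data:

```javascript
// Object.fromEntries works on any iterable of [key, value] pairs,
// including a Map (sample data for illustration only)
const ages = new Map([
  ['amelia', 20],
  ['beatrice', 22]
])

// convert the Map into an ordinary object
const agesObj = Object.fromEntries(ages)
console.log(agesObj) // { amelia: 20, beatrice: 22 }
```

This makes round-tripping between Maps and plain objects symmetric, since `new Map(Object.entries(obj))` already covered the other direction.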
```javascript
let students = {
  amelia: 20,
  beatrice: 22,
  cece: 20,
  deirdre: 19,
  eloise: 21
}

// convert to an array in order to make use of .filter()
let overTwentyOne = Object.entries(students).filter(([name, age]) => {
  return age >= 21
})
// [ [ 'beatrice', 22 ], [ 'eloise', 21 ] ]

// turn the multidimensional array back into an object
let DrinkingAgeStudents = Object.fromEntries(overTwentyOne)
// { beatrice: 22, eloise: 21 }
```

It is important to note that arrays and objects are different data structures for a reason. There are certain cases in which switching between the two will cause data loss. The example below, of array elements that become duplicate object keys, is one of them.

```javascript
let students = [
  [ 'amelia', 22 ],
  [ 'beatrice', 22 ],
  [ 'eloise', 21 ],
  [ 'beatrice', 20 ]
]

let studentObj = Object.fromEntries(students)
// { amelia: 22, beatrice: 20, eloise: 21 }
// dropped the first beatrice!
```

When using these functions, make sure to be aware of the potential side effects.

Support for Object.fromEntries: Chrome 75, Firefox 67, Safari 12.1; not supported in Edge.

Array.prototype.flat

Multi-dimensional arrays are a pretty common data structure to come across, especially when retrieving data. The ability to flatten them is necessary. It was always possible, but not exactly pretty. Let’s take the following example, where our map leaves us with a multi-dimensional array that we want to flatten.
```javascript
let courses = [
  {
    subject: "math",
    numberOfStudents: 3,
    waitlistStudents: 2,
    students: ['Janet', 'Martha', 'Bob', ['Phil', 'Candace']]
  },
  {
    subject: "english",
    numberOfStudents: 2,
    students: ['Wilson', 'Taylor']
  },
  {
    subject: "history",
    numberOfStudents: 4,
    students: ['Edith', 'Jacob', 'Peter', 'Betty']
  }
]

let courseStudents = courses.map(course => course.students)
// [
//   [ 'Janet', 'Martha', 'Bob', [ 'Phil', 'Candace' ] ],
//   [ 'Wilson', 'Taylor' ],
//   [ 'Edith', 'Jacob', 'Peter', 'Betty' ]
// ]

[].concat.apply([], courseStudents) // we're stuck doing something like this
```

In comes Array.prototype.flat. It takes an optional argument of depth.

```javascript
let courseStudents = [
  [ 'Janet', 'Martha', 'Bob', [ 'Phil', 'Candace' ] ],
  [ 'Wilson', 'Taylor' ],
  [ 'Edith', 'Jacob', 'Peter', 'Betty' ]
]

let flattenOneLevel = courseStudents.flat(1)
console.log(flattenOneLevel)
// [ 'Janet', 'Martha', 'Bob', [ 'Phil', 'Candace' ],
//   'Wilson', 'Taylor', 'Edith', 'Jacob', 'Peter', 'Betty' ]

let flattenTwoLevels = courseStudents.flat(2)
console.log(flattenTwoLevels)
// [ 'Janet', 'Martha', 'Bob', 'Phil', 'Candace',
//   'Wilson', 'Taylor', 'Edith', 'Jacob', 'Peter', 'Betty' ]
```

Note that if no argument is given, the default depth is one. This is incredibly important, because in our example that would not fully flatten the array.

```javascript
let defaultFlattened = courseStudents.flat()
console.log(defaultFlattened)
// [ 'Janet', 'Martha', 'Bob', [ 'Phil', 'Candace' ],
//   'Wilson', 'Taylor', 'Edith', 'Jacob', 'Peter', 'Betty' ]
```

The justification for this decision is that the function is not greedy by default and requires explicit instructions to operate as such. For an unknown depth, with the intention of fully flattening the array, the argument Infinity can be used.
```javascript
let alwaysFlattened = courseStudents.flat(Infinity)
console.log(alwaysFlattened)
// [ 'Janet', 'Martha', 'Bob', 'Phil', 'Candace',
//   'Wilson', 'Taylor', 'Edith', 'Jacob', 'Peter', 'Betty' ]
```

As always, greedy operations should be used judiciously and are likely not a good choice if the depth of the array is truly unknown.

Support for Array.prototype.flat: Chrome 75, Firefox 67, Safari 12; not supported in Edge. Mobile: Chrome Android 75, Firefox Android 67, iOS Safari 12.1, Android Webview 67; not supported in IE Mobile or Samsung Internet.

Array.prototype.flatMap

With the addition of flat, we also got the combined function Array.prototype.flatMap. We've actually already seen an example of where this would be useful above, but let's look at another one. What about a situation where we want to insert elements into an array? Prior to the additions of ES2019, what would that look like?

```javascript
let grades = [78, 62, 80, 64]

let curved = grades.map(grade => [grade, grade + 7])
// [ [ 78, 85 ], [ 62, 69 ], [ 80, 87 ], [ 64, 71 ] ]

// now flatten; could use flat(), but that didn't exist before either
let flatMapped = [].concat.apply([], curved)
// [ 78, 85, 62, 69, 80, 87, 64, 71 ]
```

Now that we have Array.prototype.flat, we can improve this example slightly.

```javascript
let grades = [78, 62, 80, 64]

let flatMapped = grades.map(grade => [grade, grade + 7]).flat()
// [ 78, 85, 62, 69, 80, 87, 64, 71 ]
```

But still, this is a relatively popular pattern, especially in functional programming, so having it built into the array prototype is great. With flatMap we can do this:

```javascript
let grades = [78, 62, 80, 64]

let flatMapped = grades.flatMap(grade => [grade, grade + 7])
// [ 78, 85, 62, 69, 80, 87, 64, 71 ]
```

Now, remember that the default argument for Array.prototype.flat is one. And flatMap is the equivalent of combining map and flat with no argument.
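That equivalence is easy to check directly. A small sketch, reusing the grades data from above (variable names are illustrative):

```javascript
const grades = [78, 62, 80, 64]
const curve = grade => [grade, grade + 7]

// flatMap(fn) should produce the same result as map(fn).flat()
const viaFlatMap = grades.flatMap(curve)
const viaMapThenFlat = grades.map(curve).flat()

console.log(viaFlatMap)
// [ 78, 85, 62, 69, 80, 87, 64, 71 ]
console.log(viaMapThenFlat)
// same result
```

Because only a single implicit flat() is applied, the callback's return values are merged just one level deep.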
So flatMap will only flatten one level.

```javascript
let grades = [78, 62, 80, 64]

let flatMapped = grades.flatMap(grade => [grade, [grade + 7]])
// [ 78, [ 85 ], 62, [ 69 ], 80, [ 87 ], 64, [ 71 ] ]
```

Support for Array.prototype.flatMap: Chrome 75, Firefox 67, Safari 12; not supported in Edge. Mobile: Chrome Android 75, Firefox Android 67, iOS Safari 12.1, Android Webview 67; not supported in IE Mobile or Samsung Internet.

String.trimStart and String.trimEnd

Another nice addition in ES2019 is a pair of aliases that make some string function names more explicit. Previously, String.trimRight and String.trimLeft were available.

```javascript
let message = "   Welcome to CS 101    "

message.trimRight()
// '   Welcome to CS 101'
message.trimLeft()
// 'Welcome to CS 101    '
message.trimRight().trimLeft()
// 'Welcome to CS 101'
```

These are great functions, but it was also beneficial to give them names that align more closely with their purpose: removing whitespace from the start and the end of a string.

```javascript
let message = "   Welcome to CS 101    "

message.trimEnd()
// '   Welcome to CS 101'
message.trimStart()
// 'Welcome to CS 101    '
message.trimEnd().trimStart()
// 'Welcome to CS 101'
```

Support for String.trimStart and String.trimEnd: Chrome 75, Firefox 67, Safari 12; not supported in Edge.

Optional catch binding

Another nice feature in ES2019 is making the argument in try-catch blocks optional. Previously, all catch blocks passed the exception in as a parameter. That meant it was there even when the code inside the catch block ignored it.

```javascript
try {
  let parsed = JSON.parse(obj)
} catch (e) {
  // ignore e, or use
  console.log(obj)
}
```

This is no longer the case. If the exception is not used in the catch block, then nothing needs to be passed in at all.

```javascript
try {
  let parsed = JSON.parse(obj)
} catch {
  console.log(obj)
}
```

This is a great option if you already know what the error is and are looking for what data triggered it.

Support for optional catch binding: Chrome 75, Firefox 67, Safari 12; not supported in Edge.

Function.toString() changes

ES2019 also brought changes to the way Function.toString() operates.
Previously, it stripped white space entirely.

```javascript
function greeting() {
  const name = 'CSS Tricks'
  console.log(`hello from ${name}`)
}

greeting.toString()
// "function greeting() {\nconst name = 'CSS Tricks'\nconsole.log(`hello from ${name}`)\n}"
```

Now it reflects the true representation of the function in source code.

```javascript
function greeting() {
  const name = 'CSS Tricks'
  console.log(`hello from ${name}`)
}

greeting.toString()
// 'function greeting() {\n' +
//   "  const name = 'CSS Tricks'\n" +
//   '  console.log(`hello from ${name}`)\n' +
//   '}'
```

This is mostly an internal change, but I can't help but think this might also make the life of a blogger or two easier down the line.

And there you have it! The main feature additions to ES2019. There are also a handful of other additions that you may want to explore. Happy JavaScript coding!

Source: All the New ES2019 Tips and Tricks | CSS-Tricks

    Read at 08:07 pm, Aug 13th

  • Dear Disgruntled White Plantation Visitors, Sit Down. – Afroculinaria

Dear Disgruntled White Plantation Visitors, Hi! My name is Michael W. Twitty and I’m one of those interpreters who has watched you squirm or run away. I’m not a reenactor, because G-d forbid I reenact anything for the likes of you; but I am an interpreter, a modern person who is charged with educating you about the past. I take my job seriously because frankly you’re not the one I’m centering. I’m performing an act of devotion to my Ancestors. This is not about your comfort, it’s about honoring their story on its own terms in context. For over a decade I have been working towards my personal goal of being the first Black chef in 150 years to master the cooking traditions of my colonial and Antebellum ancestors. Five trips to six West African nations and more on the way, and having cooked in almost every former slaveholding state beneath the Mason-Dixon line, my work is constant and unrelenting, mostly because I have to carve my way through a forest of stereotypes and misunderstandings to bring our heritage to life. I also just want to preserve the roots of our cooking before they’re gone. Because minds like yours created the “happy darky,” some people of color are ashamed of my work. Although I am none of the things they imagine me to be, I can understand why they are confused about what I (and many people like me) do. Once upon a time folks like yourselves wanted to have a national Mammy monument on the Mall, to remind us about the “proper” role we were meant to occupy and to praise our assumed loyalty. No, our forebears are the real greatest generation. With malice towards none they constantly took their strike at freedom, and yet their heroism was obscured because, you guessed it, white supremacy had to have the final say. Southern food is my vehicle for interpretation because it is not apolitical. It is also drenched in all the dreadful funkiness of the history it was created in. It’s not my job to comfort you. It’s not my job to assuage any guilt you may feel.
That’s really none of my business. My job is to show you that my Ancestors (and some of yours, quiet as it’s kept…go get your DNA done…like right now…talking to you Louisiana and South Carolina…) resisted enslavement by maintaining links to what scholar Charles D. Joyner famously called a “culinary grammar” that contained whole narratives that reached into spirituality, health practices, linguistics, agricultural wisdom and environmental practices that constituted, in the words of the late historian William D. Piersen, a resistance “too civilized to notice.” Want to read about it? Since you already know I’m a literate runaway from the American educational system, I wrote an award-winning book called The Cooking Gene. Like Eddie Murphy said, “but buy my record first…” (BTW it’s not a cookbook; it’s the story of my family told through culinary history from Africa to America and from enslavement to freedom.) What’s most telling about the above quote and others is how blithely unaware you are about the real American struggle for freedom. When you’re in one of those hot ass kitchens watching me melt, you are secretly telling yourself you’re glad you’re not me–or them. And yes, I’m about to go Designing Women/Julia Sugarbaker (in that pink hoop skirt) on you…so you might want to run now. Thanks to a viral tweet the whole country sees what me and my colleagues have seen for quite some time. We get it. You want romance, Moonlight and Magnolias, big Greek Revival columns, prancing belles in crinoline, perhaps a distinguished hoary headed white dude with a Van Dyke beard in a white suit with a black bow tie who looks like he’s about to bring you some hot and fresh chicken some faithful Mammy sculpture magically brought to life has prepared for you out back. The Old South may be your American Downton Abbey but it is our American Horror Story; even under the best circumstances it represents the extraction of labor, talent and life we can never get back. 
When I do this work, it drains me, but I do it because I want my Ancestors to know not only are they not forgotten but I am here to testify that I am their wildest dreams manifest. While your gall and nerve anonymously preserved for eternity online is cute, I thought you might want to be further disturbed not by the actions of the dead, but by those of the living: Like remember when you took the form of the docent in Virginia who told me, “Look, you don’t have to go on about the history, just tell them you’re the cook and be done with it…” Or remember when you waltzed in with a MAGA hat and told me “I know what it’s like to be persecuted like a slave. I’m an evangelical Christian in America. It’s scary!” (More power to you for your faith, but that analogy? Or skewed perception? Or saying that nonsense to my face with the assumed confidence that I wouldn’t respond?) My personal favorite was when I spilled some of the contents of a heavy pot of water as the light was dying and you all laughed and one of you said…and I could hear you…“This boy doesn’t know what he’s doing.” “Boy.” I was exhausted. I had been cooking over an open hearth for 7 hours. One enslaved cook in Martinique was thrown alive into an oven for burning a cake. How do we know? His mistress calmly showed his charred remains to her guest after the meal. Spilling or burning food could have meant my ass. How about that time you asked me if I lived in that kitchen with the dirt floor? Or when you said I was “well fed” and had “nothing to complain about.” “This isn’t sooo bad. White poor people had it just as bad if not worse.” I do so love it when folks like you ask me “What are you making me for dinner?” In South Carolina there was that time four of you walked in grinning and salivating as you often do, and were all ready to be regaled with tales of the good old days until a German tourist scratched your record. 
He said, “How do you feel as a Black American, dressing like your Ancestors and cooking and working this way?” You started to frown. I said, “Slavery was colloquial and discretionary; one story doesn’t tell all. But it’s important to remember that our Ancestors survived this. Survived slavery.” He pushed me further. You gestured towards the door. “How do people feel about slavery?” My retort was fast. “How do you feel about the Shoah? How do you feel about the Holocaust?” The German said, “The Holocaust was a terrible thing and never should have happened. We were children when Germany was coming out of the ashes. But it is a shame upon our nation.” As the four of you turned to leave, I got in a good one: “That’s a phrase you will almost never hear some white Southerners say: ‘Slavery was a terrible thing and never should have happened.’” But…the South is not to be indicted on its own. Without Northern slave trade captains, merchants, mill owners, and even universities that had stock in the enslaved, the Southern economy could not have flourished. (And please miss me with “you sold your own people…” The corporate identity of Blackness was not a feature when African, Arab and European elites and merchants conspired during the time of the slave trades…you can’t learn everything from the crossword section of StormFront…) Furthermore your immigrant ancestors would never have had a land of opportunity to come to. Or a people to walk on as your folks climbed towards whiteness. The most valuable “commodity” in Antebellum America during the years of exponential growth was not wheat, corn, tobacco, rice or even cotton. The most important commodity of the mid-19th century in America was the Black child, and behind the children, the body of the Black woman. Don’t get me wrong. This isn’t about being anti-white or ignoring other people’s traumas. But if you do think I don’t like you because you identify as white, that’s not it. 
I suspect what you might be doing is identifying with healthy slices of weaponized racial power, privilege, attainment and achievement obtained in a hierarchical, exploitative American dream between two pieces of unexamined whiteness. If so, I guess the plantation isn’t the ideal place for you to escape. [Photo: The moment genealogist Lon Outen helped me discover the plantation where my great great great grandfather was enslaved near Lancaster, South Carolina.] Facing my/our past has been my life’s journey. It’s also been at times devastating and painful. But reflection in no way equals one second in the lives of the enslaved women and men whose blood flows in my veins. I had the privilege of rediscovering my roots on a North Carolina plantation at a dinner we prepared for North Carolinians of all backgrounds. Knowing that the enslaved people who once occupied those cabins could never have dreamed of that rainbow of people sitting together as equals in prayer, food and fellowship while my Asante and Mende roots were being uncovered after centuries of obfuscation was for me a holy moment. You miss out on magic like that when you shut down your soul. Going to what few plantations remain, your job is to go with respect and homage and light. You know, like I felt at the Tenement Museum, learning about the first American experience of those who passed through Ellis Island. Your job is to be thankful and grateful. Your job is to not just hear but listen. Your job is to know that Black lives mattered then just as they do now. Your job is to face the reality that hardships and hurt have been passed down from the American Downton Abbey, the American plantation. [Photo: Scholar and museum director Kathe Hambrick showing me lists of enslaved Ancestors on a Louisiana sugar plantation.] Not far away a Black family was living without running water…in 2012. Rape happened there…to the point where almost every African American with long roots here bears that evidence in their DNA. Theft of our culture. 
Forced assimilation. The breaking up of families…like all of us. Of course there was economic and legal exploitation and oppression, the effects of which have never been extricated from the American story. Imagine what it would be like to meet your long lost family and to discover that in merely 2 generations or even less, millennia of knowledge had been beaten out of you… [Photo: Abomey, Benin, West Africa, 2019] But because enslavement was so damn fuzzy…we forget that those maudlin moments of blurred lines passed down by sentimental whites were purchased with pain. I tell my audiences that enslavement wasn’t always whips and chains; but it was the existential terror that at any moment 3/5ths could give way to its remainder, and unfortunately often did. Guilt is not where to start. If you go back, start with humility. Have some shame that NONE of us are truly taught this. Be like the working-class white lady whose family I met in Louisiana, who brought her young kids because she “wanted them to know the whole story, the story of American history is Black history.” Too bad she ain’t going viral. Wherever you are my cousin, I salute you. Go to Whitney Plantation in Louisiana. Seriously. You won’t regret it. (For the non-disgruntled.) Right now we need people to exercise their compassion muscle over their dissatisfaction or disappointment. Right now we need people to see the parallels. Right now we need people to remember the insidious ways history repeats itself. Right now we need to be better humans to each other. Right now we need people to remember the righteous who sacrificed so we could tweet and leave awful online reviews. Y’all come back now, y’hear? Source: Dear Disgruntled White Plantation Visitors, Sit Down. – Afroculinaria

    Read at 07:58 pm, Aug 13th

  • Ned Batchelder: Why your mock doesn’t work

    Friday 2 August 2019 Mocking is a powerful technique for isolating tests from undesired interactions among components. But often people find their mock isn’t taking effect, and it’s not clear why. Hopefully this explanation will clear things up. BTW: it’s really easy to over-use mocking, and there are good explanations of alternative approaches. A quick aside about assignment Before we get to fancy stuff like mocks, I want to review a little bit about Python assignment. You may already know this, but bear with me. Everything that follows is going to be directly related to this simple example. Variables in Python are names that refer to values. If we assign a second name, the names don’t refer to each other; they both refer to the same value. If one of the names is then assigned again, the other name isn’t affected:

x = 23
y = x
x = 12

If this is unfamiliar to you, or you just want to look at more pictures like this, Python Names and Values goes into much more depth about the semantics of Python assignment. Importing Let’s say we have a simple module like this:

# mod.py
val = "original"

def update_val():
    global val
    val = "updated"

We want to use val from this module, and also call update_val to change val. There are two ways we could try to do it. At first glance, it seems like they would do the same thing. The first version imports the names we want, and uses them:

# code1.py
from mod import val, update_val
print(val)
update_val()
print(val)

The second version imports the module, and uses the names as attributes on the module object:

# code2.py
import mod
print(mod.val)
mod.update_val()
print(mod.val)

This seems like a subtle distinction, almost a stylistic choice. But code1.py prints “original original”: the value hasn’t changed! Code2.py does what we expected: it prints “original updated.” Why the difference? 
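Before walking through why, both behaviors can be checked empirically in one self-contained file. This is a sketch, not the post's own code: the mod module is built in memory with types.ModuleType so the snippet runs on its own.

```python
import sys
import types

# Plain assignment first: names refer to values, never to each other.
x = 23
y = x      # y refers to the same value x does
x = 12     # rebinding x leaves y untouched
print(x, y)  # 12 23

# Build the article's mod.py as an in-memory module so this runs as one file.
mod = types.ModuleType("mod")
exec(
    'val = "original"\n'
    "def update_val():\n"
    "    global val\n"
    '    val = "updated"\n',
    mod.__dict__,
)
sys.modules["mod"] = mod

# code1.py style: "from mod import val" assigns mod.val to our local name.
from mod import val, update_val

update_val()
print(val)      # "original": our local name was never reassigned
print(mod.val)  # "updated": code2.py-style attribute access sees the change
```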
Let’s look at code1.py more closely:

# code1.py
from mod import val, update_val
print(val)
update_val()
print(val)

After “from mod import val”, when we first print val, we have this: [diagram: mod.py’s name val and code1.py’s name val both refer to the string ‘original’] “from mod import val” means, import mod, and then do the assignment “val = mod.val”. This makes our name val refer to the same object as mod’s name val. After “update_val()”, when we print val again, our world looks like this: [diagram: mod.py’s val now refers to ‘updated’, while code1.py’s val still refers to ‘original’] update_val has reassigned mod’s val, but that has no effect on our val. This is the same behavior as our x and y example, but with imports instead of more obvious assignments. In code1.py, “from mod import val” is an assignment from mod.val to val, and works exactly like “y = x” does. Later assignments to mod.val don’t affect our val, just as later assignments to x don’t affect y. Now let’s look at code2.py again:

# code2.py
import mod
print(mod.val)
mod.update_val()
print(mod.val)

The “import mod” statement means, make my name mod refer to the entire mod module. Accessing mod.val will reach into the mod module, find its val name, and use its value. [diagram: code2.py’s name mod refers to the mod module, whose val refers to ‘original’] Then after “update_val()”, mod’s name val has been changed: [diagram: mod’s val now refers to ‘updated’] Now we print mod.val again, and see its updated value, just as we expected. OK, but what about mocks? Mocking is a fancy kind of assignment: replace an object (or function) with a different one. We’ll use the mock.patch function in a with statement. It makes a mock object, assigns it to the name given, and then restores the original value at the end of the with statement. Let’s consider this (very roughly sketched) product code and test:

# product.py
from os import listdir

def my_function():
    files = listdir(some_directory)
    # ... use the file names ...
# test.py
def test_it():
    with mock.patch("os.listdir") as listdir:
        listdir.return_value = ['a.txt', 'b.txt', 'c.txt']
        my_function()

After we’ve imported product.py, both the os module and product.py have a name “listdir” which refers to the built-in listdir() function. The references look like this: [diagram: os module’s listdir and product.py’s listdir both refer to the real listdir() function] The mock.patch in our test is really just a fancy assignment to the name “os.listdir”. During the test, the references look like this: [diagram: os module’s listdir refers to the mock, but product.py’s listdir still refers to the real listdir() function] You can see why the mock doesn’t work: we’re mocking something, but it’s not the thing our product code is going to call. This situation is exactly analogous to our code1.py example from earlier. You might be thinking, “ok, so let’s do that code2.py thing to make it work!” If we do, it will work. Your product code and test will now look like this (the test code is unchanged):

# product.py
import os

def my_function():
    files = os.listdir(some_directory)
    # ... use the file names ...

# test.py
def test_it():
    with mock.patch("os.listdir") as listdir:
        listdir.return_value = ['a.txt', 'b.txt', 'c.txt']
        my_function()

When the test is run, the references look like this: [diagram: os module’s listdir refers to the mock, and product.py’s name os refers to the whole os module] Because the product code refers to the os module, changing the name in the module is enough to affect the product code. But there’s still a problem: this will mock that function for any module using it. This might be a more widespread effect than you intended. Perhaps your product code also calls some helpers, which also need to list files. The helpers might end up using your mock (depending how they imported os.listdir!), which isn’t what you wanted. Mock it where it’s used The best approach to mocking is to mock the object where it is used, not where it is defined. 
Your product and test code will look like this:

# product.py
from os import listdir

def my_function():
    files = listdir(some_directory)
    # ... use the file names ...

# test.py
def test_it():
    with mock.patch("product.listdir") as listdir:
        listdir.return_value = False
        my_function()

The only difference here from our first try is that we mock “product.listdir”, not “os.listdir”. That seems odd, because listdir isn’t defined in product.py. That’s fine: the name “listdir” is in both the os module and in product.py, and they are both references to the thing you want to mock. Neither is a more real name than the other. By mocking where the object is used, we have tighter control over what callers are affected. Since we only want product.py’s behavior to change, we mock the name in product.py. This also makes the test more clearly tied to product.py. As before, our references look like this once product.py has been fully imported: [diagram: os module’s listdir and product.py’s listdir both refer to the real listdir() function] The difference now is how the mock changes things. During the test, our references look like this: [diagram: os module’s listdir still refers to the real listdir() function, but product.py’s listdir refers to the mock] The code in product.py will use the mock, and no other code will. Just what we wanted! Is this OK? At this point, you might be concerned: it seems like mocking is kind of delicate. Notice that even with our last example, how we create the mock depends on something as arbitrary as how we imported the function. If our code had “import os” at the top, we wouldn’t have been able to create our mock properly. This is something that could be changed in a refactoring, but at least mock.patch will fail in that case. You are right to be concerned: mocking is delicate. It depends on implementation details of the product code to construct the test. There are many reasons to be wary of mocks, and there are other approaches to solving the problems of isolating your product code from problematic dependencies. 
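Both outcomes can be demonstrated in one runnable sketch. The names fakeos, product, and my_function are illustrative stand-ins (built in memory) for the post's os and product.py, chosen so the demo needs no real filesystem:

```python
import sys
import types
from unittest import mock

# Stand-in for the os module, with a listdir that needs no real filesystem.
fakeos = types.ModuleType("fakeos")
fakeos.listdir = lambda path: ["real.txt"]
sys.modules["fakeos"] = fakeos

# Stand-in for product.py, which does "from fakeos import listdir".
product = types.ModuleType("product")
exec(
    "from fakeos import listdir\n"
    "def my_function():\n"
    "    return listdir('some_directory')\n",
    product.__dict__,
)
sys.modules["product"] = product

# Patching where listdir is *defined* does not reach product's own name:
with mock.patch("fakeos.listdir", return_value=["mocked.txt"]):
    print(product.my_function())  # ["real.txt"] -- the mock missed

# Patching where listdir is *used* affects exactly the code we intended:
with mock.patch("product.listdir", return_value=["mocked.txt"]):
    print(product.my_function())  # ["mocked.txt"]
```

The string passed to mock.patch is resolved through sys.modules, which is why registering the in-memory modules there is enough for patching to work.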
    If you do use mocks, at least now you know how to make them work, but again, there are other approaches. See the links at the top of this page. Source: Ned Batchelder: Why your mock doesn’t work

    Read at 03:39 pm, Aug 13th

  • Cabán Final Statement - Google Docs

    FOR IMMEDIATE RELEASE Tuesday, August 6, 2019 Contact: queens.electoral@socialists.nyc STATEMENT ON THE OUTCOME OF THE DEMOCRATIC PRIMARY FOR QUEENS DA The New York City chapter of the Democratic Socialists of America would like to thank the hundreds of members throughout the city who volunt... Source: Cabán Final Statement – Google Docs

    Read at 01:21 pm, Aug 13th

  • Automattic Acquires Tumblr, Plans to Rebuild the Backend Powered by WordPress – WordPress Tavern

    Automattic has acquired Tumblr, a long-time friendly rival company, for an undisclosed sum. Just six years after Yahoo acquired Tumblr for $1.1 billion, the company is said to have been acquired for “a nominal amount” from Verizon, which indirectly acquired Tumblr when it bought Yahoo in 2017. Automattic CEO Matt Mullenweg declined to comment on the financial details of the acquisition, but a source cited by Axios puts the deal “well south of $20 million.” Tumblr is Automattic’s biggest acquisition yet in terms of product users and employees gained. The microblogging and social networking website currently hosts 475.1 million blogs, for which Automattic will now assume operating costs. All 200 of Tumblr’s employees will be moving over to Automattic, bringing the company’s total employee count over 1,000. Mullenweg took to the Post Status community Slack channel for an impromptu Q&A this afternoon where he discussed more of Automattic’s plans for Tumblr. He outlined a brief roadmap for Tumblr’s future that includes re-architecting its backend with WordPress: move infrastructure off Verizon; support the same APIs on both WP.com and Tumblr; switch the backend to be WordPress; and open source a Tumblr.com client similar to Calypso. “WordPress is an open source web operating system that can power pretty much anything, including Tumblr.com, but it’s also a large property so will take a bit to figure out and migrate,” Mullenweg said. Automattic doesn’t currently have plans to change the frontend Tumblr experience. Mullenweg said the Tumblr mobile app gets 20x more daily signups than the WordPress app. “It’s working amazingly well, despite being fairly constrained in what they can launch the past few years,” he said. Tumblr changed its adult content policy in December 2018, banning pornographic content, which reportedly accounted for 22.37 percent of incoming referral traffic from external sites in 2013 when it was acquired by Yahoo. 
Automattic has a similar content policy in place for WordPress.com and Mullenweg confirmed that the company does not plan to lift the ban on adult content. “Adult content is not our forte either, and it creates a huge number of potential issues with app stores, payment providers, trust and safety… it’s a problem area best suited for companies fully dedicated to creating a great experience there,” Mullenweg said in response to questions on Hacker News. “I personally have very liberal views on these things, but supporting adult content as a business is very different.” Automattic’s Tumblr Acquisition Opens Up New Possibilities for E-Commerce, Plugins, and Themes Beyond this initial roadmap Mullenweg outlined, he also said he thinks “e-commerce on Tumblr is a great idea,” with simpler features developed first. In the past, Tumblr users who wanted to add e-commerce to their sites would need to use a service like Shopify or Ecwid and generate a Tumblr-compatible widget. Users would have to move to a self-hosted site on another platform in order to get more full-featured e-commerce capabilities. Automattic has the ability to build e-commerce into the platform using WooCommerce or any number of other existing solutions for simpler sales features. An emerging Tumblr/WordPress plugin and theme ecosystem is also a possibility but may not affect the wider WordPress ecosystem as much unless Automattic opens up the Tumblr marketplace to third-party developers. Mullenweg said once Tumblr’s backend is on WordPress, the idea of plugins can be explored. Whether that is on a private network, like WordPress.com, or a new breed of self-hosted Tumblr sites, is yet to be seen. Automattic’s apparent bargain basement deal on Tumblr is good news for the preservation of the open web, as the company is committed to supporting independent publishing. Migrating Tumblr’s infrastructure to WordPress also expands WordPress’ market share with a significantly younger user base. 
A study conducted by We Are Flint in 2018 found 43 percent of internet users between 18 and 24 years old used Tumblr. Tumblr’s primary demographic thrives on community and its current feature set is built to support that. If Automattic can preserve Tumblr’s distinct community and convenient publishing, while invisibly re-architecting it to use WordPress, users could potentially enjoy seamless transitions across platforms to suit their publishing needs. This improves the likelihood that this generation of internet users will continue to own their own content instead of tossing it away on social media silos that feed on users’ most important thoughts, writings, and memories. “I’m very excited about Tumblr’s next chapter and looking forward working with Matt Mullenweg and the entire team at Automattic,” Tumblr CEO Jeff D’Onofrio said. “I’m most excited for what this means for the entire Tumblr community. There is much more to do to make your experience a better one, and I’m super confident that we are in great hands with this news. Tumblr and WordPress share common founding principles. The plane has landed on a friendly runway. Now it is time to freshen up the jets.” In the announcement on his Tumblr blog, Mullenweg said he sees “some good opportunities to standardize on the Open Source WordPress tech stack.” This migration will undoubtedly be a formidable technical challenge and Mullenweg promised to document the team’s work after it is complete. In the meantime, the Tumblr team has new functionality they plan to introduce after the acquisition is officially closed. “When the possibility to join forces became concrete, it felt like a once-in-a-generation opportunity to have two beloved platforms work alongside each other to build a better, more open, more inclusive – and, frankly, more fun web,” Mullenweg said. “I knew we had to do it.”
Source: Automattic Acquires Tumblr, Plans to Rebuild the Backend Powered by WordPress – WordPress Tavern

    Read at 11:48 am, Aug 13th

  • Jeffrey Epstein Is the Face of the Billionaire Class

    The Jeffrey Epstein story is a case study of the abuses and pathologies inherent to extreme wealth. The only way to stop them is to create a world without billionaires.

    Read at 01:52 pm, Aug 13th

  • 4 Dating Apps Pinpoint Users’ Precise Locations – and Leak the Data

    Grindr, Romeo, Recon and 3fun were found to expose users’ exact locations, just by knowing a user name. Four popular dating apps that together can claim 10 million users have been found to leak precise locations of their members.

    Read at 01:37 pm, Aug 13th

  • Polls Since The Second Debate Show Kamala Harris Slipping

    Polls since last week’s Democratic debate haven’t shown the sort of dramatic swings that we saw after Round 1 — but they do show some shifts. In particular, they show further downward movement for Kamala Harris, who had already lost much of her bounce following the first debate.

    Read at 01:31 pm, Aug 13th

Day of Aug 12th, 2019

  • The Reddit Router Scam - Hackster Blog

    Source: The Reddit Router Scam – Hackster Blog

    Read at 11:48 am, Aug 12th

  • Async Generator Functions in JavaScript

    Puppeteer is Google's official npm module for controlling Chrome from Node.js. Using Puppeteer, you can open up a Chrome browser, navigate to an arbitrary page, and interact with the page by executing arbitrary JavaScript. Here's a short list of what you can do with Puppeteer:

    Read at 10:49 pm, Aug 12th

  • Why Conspiracy Theorists Will Never Believe the ‘Official’ Epstein Story

    The reaction from the online fever swamps was predictable enough.

    Read at 10:46 pm, Aug 12th

  • Of Claps and Pronouns

    The national convention of the Democratic Socialists of America in Atlanta this August attracted a fair amount of attention, partly because the event happened as the group is growing and its favoured presidential candidate, Senator Bernie Sanders, is still doing well in the polls.

    Read at 10:43 pm, Aug 12th

  • ‘We Are Being Eaten From Within.’ Why America Is Losing the Battle Against White Nationalist Terrorism

    For decades, U.S. officials ignored the growing threat of domestic extremism. That may finally be changing When you think of a terrorist, what do you see? For more than a generation, the image lurking in Americans’ nightmares has resembled the perpetrators of the 9/11 attacks: an Islamic jihadist.

    Read at 10:39 pm, Aug 12th

  • The 3 Laws of Locality – Learn UI Design

    One of the most efficient ways to get better at design fast is through learning heuristics: that is, short rules of thumb that come up in a wide variety of situations. I’m a big believer of this.

    Read at 10:26 pm, Aug 12th

  • Denver’s City Council, Led by Democratic Socialist, Stuns For-Profit Prison Operators by Nuking Contracts

    Two for-profit prison companies have lost major contracts in Denver over their work in immigrant detention, as backlash to President Donald Trump’s immigration policy continues to mount. The stunning $10.

    Read at 10:22 pm, Aug 12th

  • Progress and its discontents

    The world has never been better. From global poverty to inequality between nations, all the indicators are showing progress. This is a comforting narrative – popularized by the likes of Bill Gates and Steven Pinker.

    Read at 10:14 pm, Aug 12th

  • Donald Trump nominates man whose firm tripled price of insulin to regulate drug companies

    Donald Trump’s pick for health secretary, Alex Azar, was previously an executive at a pharmaceutical company that repeatedly raised the prices of its drugs and tripled the cost of its top-selling insulin over the five years he served as a company president, it has emerged.

    Read at 01:27 pm, Aug 12th

Day of Aug 11th, 2019

  • Trump’s White Identity Politics Appeals to Two Different Groups

    Over the past month, President Donald Trump has embarked on a concerted push to place race at the heart of the 2020 election, first by saying that a group of four progressive congresswomen of color should “go back [to] the totally broken and crime infested places from which they came” and then w

    Read at 03:25 pm, Aug 11th

  • N-Word at the New School

    Being a good writer and a good writing teacher don’t always go hand in hand. But for poet and novelist Laurie Sheck, they do. Sheck’s been a Pulitzer Prize finalist, held prestigious fellowships, written books and seen her work published in The New Yorker, The New York Times and Paris Review.

    Read at 12:42 pm, Aug 11th

  • Is ‘Bernie or Bust’ the Future of the Left?

    The Democratic Socialists of America figure out what it means to oppose Donald Trump. ATLANTA — Three years ago, the Democratic Socialists of America had 5,000 members. Just another booth at the campus activities fair, another three-initialed group an uncle might mention over lunch.

    Read at 01:11 am, Aug 11th

Day of Aug 10th, 2019