The producer and DJ Bonobo (aka Simon Green) had big news for fans today, with the release of a brand new EP dubbed Bambro Koyo Ganda and the announcement of an extensive North American tour that runs from the end of August through October.

Bonobo’s Bambro Koyo Ganda is a three-song release that pairs a new song, “Samurai,” with two versions of its title track, including an analog version of “Bambro Koyo Ganda.” The title track features Innov Gnawa and originally appeared on Bonobo’s latest full-length album, Migration, released in January of this year. In April, Bonobo released a music video for “Bambro Koyo Ganda,” which you can check out below.

Coinciding with the release of Bambro Koyo Ganda, Bonobo announced a slew of new fall tour dates that will see the producer and DJ perform from coast to coast across the United States and Canada. The first show of the tour falls on August 23rd at New York City’s new venue, Brooklyn Steel. From there, the musician will work his way westward, hitting Seattle’s Paramount on September 18th to kick off the West Coast leg of the tour. After a number of West Coast dates, the fall tour will wrap with a string of shows across Texas before a final date at Miami’s iii Points on October 13th.

You can stream Bambro Koyo Ganda below, then check out Bonobo’s upcoming fall tour dates to see if he’ll be hitting a city near you.

<a href="http://bonobomusic.bandcamp.com/album/bambro-koyo-ganda">Bambro Koyo Ganda by Bonobo</a>
Bonobo Fall Tour Dates:
08-23 Brooklyn, NY – Brooklyn Steel
08-24 Burlington, VT – Higher Ground
08-25 Portland, ME – State Theatre
08-26 Mount Tremblant, Quebec – Wanderlust
08-29 Toronto, Ontario – Danforth
08-31 Detroit, MI – Majestic Theater
09-01 Chicago, IL – North Coast
09-06 Aspen, CO – Belly Up Aspen
09-08 Boulder, CO – Boulder Theater
09-12 Edmonton, Alberta – Union Hall
09-13 Calgary, Alberta – Palace Theatre
09-15 Vancouver, British Columbia – Malkin Bowl
09-18 Seattle, WA – Paramount
09-20 Portland, OR – Roseland
09-23 San Francisco, CA – Bill Graham Civic Auditorium
09-27 Los Angeles, CA – Greek Theatre
09-28 Pioneertown, CA – Pappy & Harriet’s
10-03 Phoenix, AZ – Van Buren
10-04 Tucson, AZ – Rialto
10-06-08 Austin, TX – Austin City Limits
10-07 Austin, TX – Emo’s
10-08 San Antonio, TX – Aztec Theater
10-11 Dallas, TX – House of Blues
10-12 Houston, TX – House of Blues
10-13-15 Austin, TX – Austin City Limits
10-13-15 Miami, FL – iii Points
When she retires from her post as the Roy E. Larsen Librarian of Harvard College next month, efforts to sum up the career of Nancy Cline will invariably point to the massive, multi-year renovation of Widener Library as one of her greatest accomplishments. Such efforts, however, only scratch the surface of a career that has spanned dramatic change for Harvard’s libraries. In her 15-year tenure, Cline changed not just the physical appearance of the libraries, but the very nature of how patrons – whether students, faculty, or researchers – interact with them.

An early advocate for bringing the digital world inside the walls of the library, Cline helped develop new methods of preservation through digitization and pioneered new ways of delivering library materials to users all over the world, all while continuing to serve Harvard students and faculty and the wider academic community.

“If we think about the time Nancy Cline has been at Harvard, and what we thought libraries were when she arrived, and what we now understand libraries to be – it’s nothing short of a complete revolution,” said Harvard University President Drew Gilpin Faust. “Nancy has been at the forefront of that change, has enabled Harvard’s libraries to sustain a leadership role in that change, and to adapt and grow in extraordinarily transformative times.

“What Harvard is, in no small part, is what its libraries are,” Faust continued. “This University is deeply dependent on its libraries, and Nancy has served on many national and international boards and committees, helping not just Harvard, but the wider world, to understand what has happened, and what is going to happen in the world of libraries. She has been a force in our digitization efforts, and in many of the other ways we have seized the future.”
A word of advice, oenophiles: start stocking up now.

In recent years, French vintners have produced a number of exceptional vintages, and Elizabeth Wolkovich, assistant professor of organismic and evolutionary biology, says that climate change is part of the reason why. As climate change continues to drive temperatures higher, however, that recent winning streak could soon come to an end.

By examining more than 500 years of harvest records, Wolkovich and researchers from NASA found that wine grape harvests across France, on average, now occur two weeks earlier than in the past, largely because climate change is pushing temperatures higher without the aid of drought. While earlier harvests are normally associated with higher-quality wines, the researchers caution that the trend likely won’t last forever. The study is described in a March 21 paper in Nature Climate Change.

“There are two big points in this paper. The first is that harvest dates are getting much earlier, and all the evidence points to it being linked to climate change,” Wolkovich said. “Especially since 1980, when we see a major turning point for temperatures in the Northern Hemisphere, we see harvest dates across France getting earlier and earlier.

“The bad news is that if we keep warming the globe, we will reach a tipping point,” she continued. “The trend, in general, is that earlier harvests lead to higher-quality wine, but you can connect the dots here … We have several data points that tell us there is a threshold we will probably cross in the future where higher temperatures will not produce higher quality.”

“It’s become so warm thanks to climate change, grape growers don’t need drought to get these very warm temperatures,” said Benjamin Cook, the lead author and a climate scientist at Columbia University’s Lamont-Doherty Earth Observatory and NASA’s Goddard Institute for Space Studies. “After 1980, the drought signal effectively disappears.
That means there’s been a fundamental shift in the large-scale climate under which other, local factors operate.”

As a measure of how close that threshold may be, Wolkovich pointed to 2003. That summer, she said, the continent suffered through a massive heat wave marked by record temperatures that led to many deaths. Yet despite producing the earliest harvest in the study, the wines that year were of mixed quality.

Importantly, Wolkovich noted, the study is not an examination of a single vineyard, or even a single wine-growing region.

“One of the strengths of this paper is that it covers all of France,” she said. “These records come from … Burgundy, Bordeaux, Loire, and even from Switzerland, so we’re looking at an aggregate of many data sets that are put together to get one picture of how a large region is changing.”

What the study can’t address is what may be taking place in any single, specific vineyard.

“If you are on a certain slope with a certain soil type in Bordeaux, you will see a slightly different response than if you were on a wet soil in Burgundy,” Wolkovich said. “So there is a lot of local climate that matters to when you harvest your grapes, and especially what kind of quality you get, and this type of analysis is not targeted to answering those questions about local climate.”

It’s the connection between grape and climate — what vintners call terroir — that makes winemaking one of the most ideal proxies for examining climate change.

“At the heart of a good wine is climate,” Wolkovich said. “So [the grapes] are a very good canary in the coal mine … and that’s something you see in this paper, which is that temperature is the biggest driver of when you harvest wine grapes.

“You want to harvest when the grapes are perfectly ripe, when they’ve had enough time to accumulate just the right balance between acid and sugar,” she continued.
“For much of France, there have been times when it’s difficult to get the exact harvest date growers want because the climate wasn’t warm enough that year … but climate change means the grapes are maturing faster.”

Even if the issues weren’t so tightly intertwined, wine grape records would still be invaluable.

“These are some of the longest human records we have where people are actually writing down data year after year,” Wolkovich said. “Originally, it was the church that was keeping track of these records … so our data goes back to 1300, but our analysis starts at 1600.”

Most climate records, Wolkovich said, cover much less than a century — after climate change had already begun — making it difficult to gain insight into how the climate system functioned in the past.

“This study was possible because of the tremendous harvest records and multicentury-long records of temperature, precipitation, and soil moisture reconstructed from historical documents and tree rings,” Cook said. “We can now use this data to ask questions about whether what we understand from a record of the last 30 years is actually representative of what that system was doing 100 or 200 or, in this case, 400 years ago.”

The trend that emerged when researchers began to examine that record, Wolkovich said, showed that even modest increases in temperature can result in earlier harvests.

“We are looking at climate change that, on average, has warmed the globe about 0.6 degrees Celsius,” she said. “But as we go forward, with projections of 2 or more degrees of additional warming, we could be talking about significantly earlier harvests.”

Ultimately, Wolkovich believes, the study’s findings should serve as a lesson in how real — and immediate — climate change’s effects are.

“Grapes have allowed us, because of their long-term record, to see that we have fundamentally shifted the climate system through our actions,” Wolkovich said.
“We have made these extremely hot summers in Europe no longer hot and dry, but now hot and possibly humid, which is dangerous for people. But as we can see from these long-term records, there are other, cascading consequences for wine grapes as well.”
Ron: Hello Ganesh, thanks for taking the time to share your story with us. Can you tell me a little about yourself, your role, and the business Datrium is in?

Ganesh: I was a Principal Engineer at VMware (employee #35, if I recall correctly) on the hypervisor team. I left VMware to co-found Datrium with four other industry veterans from Data Domain and VMware. The Datrium founders have very different backgrounds, and the company draws on this experience. I presently lead an engineering team at Datrium, but as a founder I wear many hats.

Datrium is in the business of simplifying virtualization and application deployment infrastructure end to end. HCI 1.0 converged compute and primary storage, which was a good idea – but it did not consider low-latency, mission-critical deployments, nor infrastructure for simple/efficient backups, DR orchestration, or hybrid/multi-clouds from day one. Consider us HCI 2.0 – we got the opportunity to learn what is wrong with HCI 1.0, do it right, and take it further into the heart of the data center.

Ron: Can you tell me about Datrium’s DVX solution? What challenges are you helping customers solve with the platform?

Ganesh: One of the biggest problems in the IT industry is complexity – there are silos and separate domains for the primary workload and backups/DR orchestration. Worse yet, the public cloud has not simplified any of this; it has become yet another management domain that the IT admin needs to worry about. HCI 1.0 simplified this somewhat by converging compute and primary storage, but it still left behind a whole lot of complexity, particularly as you scale.

Datrium takes convergence much further by converging compute, primary storage, backup, DR orchestration, and archival/recovery to cloud in one simple-to-manage platform. The administrator deploys one platform that handles it all with minimal data movement, very high performance, and low-RTO/RPO recovery and DR orchestration.
The platform can handle absolutely any primary workload and manages the data through its entire lifecycle, including backup, DR, and archival. A modern, consumer-grade UI makes all of this intuitive, simple, and delightful.

Ron: What technologies did you choose as part of your solution? What drove those decisions?

Ganesh: What we do very well is infrastructure software that offers reliable, high-performance storage, data protection, orchestration, and data management services. And we are open – we work well with VMware, Red Hat, Kubernetes, AWS, you name it. We built all the software from the ground up – a distributed log-structured file system, orchestration, and GUI. Reliability, simplicity, and ease of use for the customer have been a driving force in every single design decision we made.

We do offer a turnkey appliance, because that is the way many of our customers prefer to consume our solution. We rely on our Dell OEM relationship to supply that turnkey solution.

Ron: What were your top priorities when you were selecting an OEM to do business with?

Ganesh: It came down to three key things:

1. Product quality and reliability. The starting point in our mind is a product that has the highest quality and reliability standards. Dell EMC OEM sets the standard here, which made partnering with Dell EMC a no-brainer.

2. Partnership and program. We sought an OEM that has an established, feature-rich program. There are many OEM vendors who can sell a product with minimal support. In Dell EMC OEM we discovered a true partner – one who has thought through the program from a supplier, buyer, and end-customer perspective. Further, the Dell EMC supply chain and support solutions are second to none.

3. Speed and ease of onboarding. We needed an OEM partner who could meet aggressive qualification and onboarding schedules. Dell EMC OEM has this figured out to a science.

As a startup we are moving fast and maximizing the resources we have at hand, but validation at scale is a challenge.
We really appreciate the support from the Dell OEM team in helping us take our solution to the next level. We could not have completed our 128-compute-node POC without the collaboration from your team.

Ron: We appreciate the collaboration. Our mission within the OEM Solutions Group is to accelerate our OEM customers’ time to market, leveraging our scale and dedicated OEM resources so you can focus on your core value – your IP – and we grow together as partners. Our focus is on building long-term partnerships; we think we make a great team, and we look forward to growing our business together.

***

For those of you reading this, I’d love to hear your thoughts or questions on how Dell EMC OEM enables our customers to be successful. Please feel free to leave a comment below or visit dellemc.com/oem.

Recently, I had the opportunity to ask Datrium co-founder Ganesh Venkitachalam a few questions about Datrium, its unique solutions, and why they chose to incorporate Dell EMC OEM technology in their appliance.
One of the key emerging technology spaces in which we are engaging with customers is the area of “Digital Twin.” At Dell Technologies, we have developed strategy, standards, and solution architectures to enable customers and partners to deploy solutions that deliver digital twin capabilities within organizations. Through this effort, we continue to build on the understanding that Digital Twin is not a “singular” emerging technology but rather a convergence of multiple solutions that together create business outcomes and value. Some of these technologies, such as HPC, have been around a very long time. Others, such as IoT or edge computing, are newer and continue to gain traction within organizations.

The Digital Twin Consortium, the industry standards body of which Dell Technologies is a founding member, defines a Digital Twin as “a virtual representation of real-world entities and processes, synchronized at a specified frequency and fidelity.” The purpose of this blog is to introduce many of the key technology enablers for Digital Twin. Some of these you may already have within your organization; for others, it is time to start planning for their implementation.

The most recognized and arguably the most important element of a Digital Twin is a capacity for simulation or modelling. Having a capability around CAD or CAE can provide a solid foundation. In the manufacturing vertical, many companies are putting their mature capabilities for simulation and/or high-performance computing at the heart of efforts to build Digital Twins for both operational environments and product development. If an organization uses HPC platforms and simulation modelling, there will already be a strong understanding and baseline capacity on which to build. A fundamental point here is that having a simulation or HPC capability alone will not deliver a Digital Twin; to do that, organizations must combine it with real-time data.
That said, let’s segue to the Internet of Things (IoT) and edge computing. The real value of IoT, from a technical perspective, is that it can provide a great deal of data, in (near) real time, to IT systems at the edge that can then deliver insights and outcomes. Combining data from IoT with historical datasets or simulation data delivers the ultimate viewpoint on how physical environments, assets, and processes are performing. Leveraging real-time data can allow true “what if” scenario planning and process refinement within a Digital Twin. For many organizations deploying IoT and edge computing solutions, as well as combining IT/OT people and processes, the integration has not been straightforward; however, there is a trend today toward distributed compute architectures that will support the demand for data activities like AI and normalization nearer the point of data creation – the edge. The time for edge computing is now, and solution architectures for edge will be the key to unlocking IoT and, in turn, enabling Digital Twin.

Another key element of a Digital Twin is the ability to interface systems. If we have historical data from our HPC solutions and real-time data from the edge, they will typically live in very different IT architectures, applications, and solution stacks. Hence, the role of APIs (application programming interfaces) is critical. Realistically, multiple IT and OT systems will need to be able to “interconnect” and communicate. Over time, a Digital Twin can and probably will become a “system of systems.” A fundamental promise of Digital Twin is that it will address the long-running issues with technology silos and “shadow IT” that have become the bugbear of many organizations. Understanding the role of integration platforms, APIs, and open-source solutions is very important.
Leaning into industry consortia, such as the Digital Twin Consortium, is key to beginning to “connect the dots” on these solutions.

Data is absolutely the most important element across all these emerging trends. It is critical to plan for and understand what is needed to manage, move, process, integrate, store, and protect the data that enables your Digital Twin. Understanding how data flows and streams within your organization, the synchronicity of the data coming from different systems, and “where” your data lives are all fundamental. There are many data concepts being discussed in the context of Digital Twins. If you think long term about building a “unified” data platform that can address the requirements of your IT/OT datasets, it will put you on the right path.

With respect to the data, key elements that can help drive more competence into a Digital Twin are an organization’s capabilities in the areas of data analytics and data science. By layering these solutions, you can understand what your data is telling you. An analytics capability allows you to inspect all elements of your data within the Digital Twin in support of your business outcomes, while solutions for machine learning and artificial intelligence allow you to build models and automated learning systems for problem solving and improvement. If you already have a competency or practice for analytics or data in place, you can drive more learning and business improvement by feeding all the data available within the Digital Twin to these platforms.

Another crucial element, one that may be a little less obvious, is how people will use the Digital Twin. Having the best available client solutions (PCs, tablets, laptops, etc.) in place will make your Digital Twin easy to use, consume, interact with, and refine.
When we see some of the marketing relating to Digital Twin adoption, it typically jumps straight to “cool” technology like AR/VR and how that technology can be used to interact with simulations or “overlays” for work instructions or training. For many organizations, however, strong data visualization around KPIs can be the most important use case for getting started with Digital Twin. Having the correct compute hardware in place – be it a PC, laptop, tablet, workstation, smartphone, or other advanced HMI technologies like headsets or haptics – will be critical to deriving value from Digital Twins.

Bear in mind, it is difficult to “buy a box” of Digital Twin. As you explore how Digital Twin can bring your organization value, understand that the core tenets above – simulation, edge and IoT, interfaces and APIs, data platforms, data science, data analytics, and client solutions – will be important areas to plan for. Lean into existing capabilities, and plan for the areas where you do not have expertise or solutions today. In future blogs, we will discuss the maturity levels of Digital Twin, some of the key partner and ecosystem solutions, industry vertical business outcomes, and developments in reference architectures and solutions.

At Dell Technologies, we are working on solutions across all areas of Digital Twin. To learn more, please reach out to your account teams.
Notre Dame fans were in for a surprise when they entered the stadium Saturday night for the game against Michigan, where the Irish emerged victorious in a 31-0 shutout of the Wolverines.

Although Notre Dame supporters came with high hopes of winning, no one expected the shutout.

“I heard we were only favored by about three points or so, so I thought it was going to be a close game,” freshman Chandler Casey said.

The possibility of a shutout became more apparent as the game progressed.

Zachary Llorens | The Observer

“I was not expecting Michigan to not get any points, but after the first quarter, I was expecting we’d do really well,” sophomore and Notre Dame Marching Band member Ben Schultz said.

As this was the final game between the University of Notre Dame and the University of Michigan for the foreseeable future, many students were disappointed to see the rivalry end, but were pleased with the outcome.

“Last year, it was a bad game overall,” junior Liliana Sanchez said. “However, this year we ended it on our terms, and I’m really happy about that. Our house, our rules.”

Freshman Quinn Brown agreed that the shutout was a great way to end the rivalry.

“This being the first and last Notre Dame-Michigan game that I’ll be able to see here, that was an awesome way to go out, especially in our home stadium,” he said.

Although Brown said he was sad to see the rivalry end, he added that he was hopeful for the future of Notre Dame’s football schedule.

“It’s a little sad that I don’t get to see more of these games,” he said. “But we have other great teams that we’re going to be playing that we’ve added to the schedule, so it’ll be exciting to possibly see some new rivalries form.”

Many students felt the band was integral to the lively atmosphere in the stadium.

“I love the marching band,” Casey said. “The marching band accounts for half of the game day experience.”

“[The band] always helps lead the student section chants and the victory march, which gets the crowd pumped,” Brown said.
“I think they are very vital to the energy of the stadium.”

Before the momentous game, the U.S. Navy SEAL parachute team, the Leap Frogs, parachuted into Notre Dame Stadium. Two of the four Navy SEALs descended into the center of the field, one carrying a Notre Dame flag and the other an American flag. As the parachutists descended, both Irish and Wolverine fans were caught off guard and in awe.

“I had no idea what was going on at first,” Sanchez said. “But then when I finally realized they were going to jump and land near the stadium, I couldn’t believe it.”

Sanchez said she was excited that the Notre Dame fans sang “Na Na Hey Hey Kiss Him Goodbye.”

“I was planning in my head to do that, but the fact that everyone joined in at the same time was perfect,” she said.

Fans from the University of Michigan shared their opinions of the Notre Dame football experience as well.

“The experience for a person who comes from [the University of Michigan] was great,” Wolverine fan Abby Schultz said. “People welcomed us to campus.”

Emma Bozek-Jarvis, also a University of Michigan fan, said that not only were the ushers kind, there were “actual [Notre Dame] students as well who were very nice to us.”

“I think it’s a great way to end the rivalry. I think it’s nice since it’ll get Michigan fans to be quiet for a little bit,” Ben Schultz said, alluding to the “Chicken Dance” song that was played after Notre Dame’s loss to Michigan last year at the Big House. “We can end it with a bang and not as chickens.”
Having graduated from Notre Dame himself in 1999, Fr. Nate Wills now resides in Keough Hall as a priest-in-residence.

Wills said he initially wasn’t interested in attending Notre Dame because his older brother had started at the University one year before him.

“I basically wanted to go to any school but Notre Dame because I thought that was his thing,” Wills said. “Then I came to visit him sometime in the fall and totally fell in love with the place. I was really excited because it felt like home almost instantly.”

He said he was particularly drawn to Old College Undergraduate Seminary.

“They were asking the same questions about discernment that I was,” Wills said. “It was just a really good environment to learn and to grow in, and some of the guys who were in Old College with me at the time are still some of my closest friends.”

Wills graduated with majors in theology and computer applications, and during his undergraduate career, he worked as a layout assistant at The Observer. He said his experience at The Observer contributed to his discernment about entering the priesthood.

“People would just casually sit next to me and bring up questions,” Wills said. “We would get into the most interesting conversations at a really deep level, and I just loved it.”

He said the experience was “confirming” for him to continue having these types of conversations on a deeper level.

“Putting yourself in a position of ministry sometimes invites beautiful conversations in,” Wills said.

After graduating from Notre Dame, Wills entered the Alliance for Catholic Education (ACE) program and taught for two years in Chicago. He said he found a “vocation within a vocation” as a high school teacher.

“[I] fell in love with the mission of Holy Cross in education,” Wills said.

He finished seminary, spent four years at St. Joseph Parish in South Bend and then attended the University of Wisconsin-Madison, where he received a Ph.D. in education in 2015.
“I studied technology in education, and my focus is on blended learning, which is using adaptive computer programs in the context of a traditional classroom to create personalized learning paths for kids and to use the data that’s kicked out of those programs to make targeted interventions and really smart ability groupings for the kids,” Wills said.

Wills returned to Notre Dame in 2015, first residing in St. Edward’s Hall before moving to Keough Hall in 2017.

Upon his return, Wills worked for ACE, where he now teaches full-time in the Remick Leadership Program for aspiring Catholic school principals. As these principals are sent across the country and the world to teach, Wills spends a lot of time traveling and working remotely.

He said he and two colleagues have a grant to implement blended learning research at five schools in the Archdiocese of St. Paul and Minneapolis.

“We are working with those five schools to really lead the change of using technology for personalization in their schools so they can give kids an education befitting their dignity as children of God,” Wills said.

Wills also spoke about his experience living on the fourth floor of Keough Hall. He said he had a desire to be part of the life of the students, and his fourth-floor room in Keough has allowed him to do this.

“The way my room is situated, it’s right in the elbow of a big thoroughfare. When I’m not traveling, the guys are great about stopping in,” Wills said. “It feels like a great community.
It’s been a wonderful experience for me.”

Wills said the hardest part of serving as a priest-in-residence is getting to know his hall’s residents and establishing a presence in the hall.

“The guys are all very welcoming and kind and often want to talk, but they just don’t always know when I’m around, so that’s been a bit of a challenge for me,” Wills said.

The most rewarding part, he said, has been seeing residents mature and move toward positions of leadership and responsibility.

“The challenges of first year are real, and it’s amazing to see guys flourish within the community and bring people along,” Wills said. “I am constantly amazed at the superpowers of the kids in my dorm.”
By U.S. Naval Forces Southern Command/U.S. 4th Fleet
July 22, 2020

U.S. Naval Forces Southern Command/U.S. 4th Fleet and Ecuadorean maritime planners held a virtual Initial Planning Conference (IPC) July 14-17 in support of the upcoming UNITAS LXI exercise, which will take place in November in Ecuador.

Rear Adm. Daniel Ginéz, Ecuadorean Navy Fleet commander, kicked off the planning conference along with his counterpart, Rear Adm. Don Gabrielson, commander, U.S. Naval Forces Southern Command/U.S. 4th Fleet.

“This team has an important mission. We are an example to our countries and the world of what cooperation means,” said Rear Adm. Gabrielson. “We are the unstoppable force and will come together to succeed even under difficult conditions.”

More than 70 planners from Argentina, Brazil, Colombia, Costa Rica, Jamaica, and Peru joined representatives from the U.S. Navy, U.S. Marine Corps, U.S. Coast Guard, and U.S. Army in a series of virtual meetings to refine the UNITAS LXI concept of operations, schedule of events, and assigned roles and responsibilities. All participating navies will virtually sign a memorandum to continue detailed planning and signify their intent to participate in this year’s exercise.

The U.S. and Ecuadorean navies set the stage for a successful IPC and future engagements with a passing exercise (PASSEX) on July 11, designed to strengthen maritime partnerships. Sailors assigned to USS Halsey (DDG 97) conducted the PASSEX with the Ecuadorean warships BAE Manabi (CM-12) and BAE Loja (CM-16). The bilateral PASSEX, planned and executed in the COVID-19 environment, strengthened tactical readiness and practiced operational command and control while signaling strategic commitment to our partners throughout the region.

“This conference and the signing of the memorandum marked the beginning of the deliberate planning for UNITAS LXI. Sixty-one years of friendship, partnership, and trust cannot be overlooked.
As we continue adapting to our changing world, UNITAS presents a great opportunity for participating navies to share ideas, enhance interoperability, and further strengthen relationships between our navies,” said Rear Adm. Gabrielson.

UNITAS, Latin for “unity,” is the longest-running multinational maritime exercise in the world. Conceived in 1959, UNITAS I took place in 1960. UNITAS LXI will focus on interoperability at sea through warfighting exercises, including live-fire events that build up to a multi-day scenario phase in which participating forces come together to operate as a multinational force.

U.S. Naval Forces Southern Command/U.S. 4th Fleet supports U.S. Southern Command’s joint and combined military operations by employing maritime forces in cooperative maritime security operations to maintain access, enhance interoperability, and build enduring partnerships in order to enhance regional security and promote peace, stability, and prosperity in the Caribbean and the Central and South American region.
A winter storm is forecast to dump 4 to 8 inches of snow on Long Island late Wednesday night into Thursday, when the precipitation is expected to change to rain before switching back to snow.

The National Weather Service issued a winter storm watch for much of the tri-state area, including Nassau and Suffolk counties, from midnight Thursday to 6 a.m. Friday, which is Valentine’s Day.

“Snowfall will make travel treacherous on Thursday,” meteorologists in the agency’s Upton office said in a statement. “Heavy, wet snow may cause some weak, flat-roof structures to collapse, and trees will be susceptible to falling.”

Downed trees may bring down power lines and cause outages. There may also be some minor coastal flooding due to astronomically higher-than-usual tides stemming from the full moon on Friday.

The flakes are forecast to start falling while temperatures are in the 20s on Wednesday night and will change to rain when temps rise above freezing Thursday afternoon. Once the mercury drops back below freezing Thursday night into early Friday morning, the precipitation is expected to switch back to snow.

Northeast winds of 20 to 30 mph, with gusts of up to 35 mph, could reduce visibility to a quarter mile at times during the storm.

The chance of snow and rain lingers through Friday evening before partly sunny skies move in for the weekend, when temps will be in the 30s. The forecast may change as the storm nears.