A Massachusetts neighborhood is on high alert this weekend after a 5-foot lizard escaped from a local home.
The
water monitor lizard, named Goose, snuck out of a home in Webster on
Friday, and its whereabouts remained unknown Saturday evening, local
officials said. Police conducted a limited search of the area around the
home for the lizard, which was owned illegally.
After consulting with professionals, police called off the search Friday night.
“We
were notified many hours after he went missing, so he could honestly be
anywhere,” a spokesperson for Webster Animal Control told the Globe
Saturday.
Water monitor lizards are known to travel, climb trees, and seek out water. They do not attack humans, dogs, or cats, animal control said in the post.
Police
are urging Webster residents to call animal control or the police
department if they spot the lizard. They strongly advise residents
against approaching the lizard themselves.
Water monitors can reach lengths exceeding eight feet, according to the US Geological Survey (USGS). The lizards feed on invertebrates, fish, carrion, and feces.
The
species is native to most of Southeast Asia and today largely populates
the coasts of Florida. Water monitors in Florida and California are
most likely escaped or released pets, according to a USGS webpage for
the species.
Books sit on the shelves at a school where U.S.
soldiers teach English to Djiboutian students March 9, 2018, in Obock,
Djibouti. (U.S. Air Force Photo by Senior Airman Erin Piazza)
Children's biographies of trailblazing
transgender public figures. An award-winning novel reflecting on what it
is like to be Black in America. A series of graphic novels about the
love story between a teenage gay couple.
Those are some of the 596
books that have been pulled from shelves in the Defense Department
schools that serve military children as part of the Trump
administration's broader effort to censor LGBTQ+ and racial issues from
official government materials.
The full list
was released by the order of a federal judge as part of the American
Civil Liberties Union's lawsuit against the Department of Defense
Education Activity's implementation of President Donald Trump's
anti-diversity and anti-LGBTQ+ executive orders.
"The
amount of titles banned by the Trump administration is astonishing, and
the list provided by DoDEA perfectly illustrates how the administration
is putting politics above pedagogy," Emerson Sykes, senior staff
attorney with the ACLU's Speech, Privacy and Technology Project, said in
an emailed statement to Military.com. "Kids on military bases have the
same First Amendment rights that we all enjoy, and that their parents
swore an oath to defend. Yet the administration has forced schools to
remove titles like 'A Is for Activist' and 'Julian Is a Mermaid' that
reflect the vibrant and diverse world we live in. All 596 of these books
must be returned to shelves immediately."
"A Is for Activist" is
an ABC board book about progressive terms and values, while "Julian Is a
Mermaid" is a picture book about a boy who wants to become a mermaid.
Among
his first acts in office, Trump ordered every federal agency to get rid
of all policies and materials related to "gender ideology," a
right-wing term for being transgender, and the ill-defined concept of
"diversity, equity and inclusion."
At the Pentagon, the orders
spurred a widespread, sometimes erratic effort to scrub minorities,
women and LGBTQ+ people from public websites and databases, some of
which were restored after public outrage. Books were also pulled from
libraries across the Defense Department, including at the service
academies that educate future military officers and the DoDEA schools
that serve military children in pre-kindergarten through high school.
In April, the ACLU filed a lawsuit on behalf of a dozen
DoDEA students and their families alleging that the book bans and other
actions to implement Trump's executive orders at the schools violate
the First Amendment.
During a June hearing in the lawsuit, Judge
Patricia Tolliver Giles, a Biden appointee in the U.S. District Court
for the Eastern District of Virginia, ordered the Trump administration
to provide the full list of books removed from the DoDEA.
The
Trump administration requested Giles reconsider her order, arguing that
the list can't be released because it is "pre-decisional" since
officials are still deciding the final fate of the books.
But on Friday, Giles reaffirmed her order and released the full list.
The majority of books on the list appear to be related to LGBTQ+ themes and issues.
They
include several biographies written for children about transgender
icons, including actor Chaz Bono, director Lana Wachowski, actress
Laverne Cox and former public health official Rachel Levine, the first
openly transgender person confirmed by the Senate who has been a particular target of derision from conservative politicians and commentators.
"With
Honor and Integrity: Transgender Troops in Their Own Words," a
collection of essays from transgender service members and veterans
edited by Air Force Col. Bree Fram and Army veteran Mael Embser-Herbert, was also removed.
Also on the list are several volumes of "Heartstopper," an acclaimed series of graphic novels, adapted into a popular Netflix series, about two British teenage boys who fall in love. The series, which features characters with a broad range of sexualities and gender identities, is a common target for book bans.
Also pulled were a few books about the history of the Stonewall riots, which are considered the start of the modern LGBTQ+ rights movement (a history the Trump administration has been rewriting to remove transgender people); multiple study guides for Advanced Placement Psychology, which includes lessons on gender identity; and a couple of books for kids going through puberty whose online summaries show they include references to gender identity.
Another sizable chunk of the banned books discuss race and racism in America.
One
such book is Ta-Nehisi Coates' "Between the World and Me," a National
Book Award winner that is written as a series of letters to his son
reflecting on racism and being Black in America.
"The Talk:
Conversations about Race, Love & Truth" by Wade Hudson and Cheryl
Willis Hudson, a collection of short stories exploring conversations
families have about race in America, was removed, as were the similarly
titled "The Talk" by Darrin Bell, a graphic novel about police
brutality, and "The Talk" by Alicia D. Williams, a picture book about a
family's advice to a young Black boy about how to navigate racism.
Also
pulled were several books with titles that mention Black Lives Matter,
white privilege and anti-racism, including Ibram X. Kendi and Jason
Reynolds' young adult novel "Stamped: Racism, Antiracism and You."
Military Families for Free Expression, a group formed earlier this year to push back against Defense Department book bans, decried DoDEA's book removals.
"This
list reflects a sweeping effort to silence voices, particularly those
centering on Black, brown and LGBTQ+ experiences," Libby Jamison, the
group's spokesperson, said in an emailed statement. "These bans aren't
about protecting children; they're about restricting what young people
are allowed to know, feel and question."
DoDEA spokesperson
Jessica Tackaberry declined to comment on the list on Monday, citing the
fact it is part of ongoing litigation, but said in an email generally
that the school system "remains committed to providing a high-quality,
standards-based education for all military-connected students and will
continue to follow established procedures as the legal process moves
forward."
Pentagon officials have previously maintained that
removed books have not been banned and are in the process of being
reviewed for a final decision on their fate. Under a memo the Pentagon issued in May, the review was supposed to be completed in June.
A Pentagon spokesperson did not respond to a question about the status of the review by Military.com's deadline Monday.
Trump administration officials have also argued that banning books is not a First Amendment violation.
"Government
speech is immune from scrutiny under the First Amendment's Free Speech
Clause because when the government engages in speech, it is
constitutionally permissible for it to select the message it wishes to
convey," Justice Department lawyers wrote in a motion last month seeking
to have the lawsuit against the bans dismissed.
The full list of banned books is included in the court documents below:
The secret of good writing is to strip every sentence to its cleanest components. Every word that serves no function, every long word that could be a short word, every adverb that carries the same meaning that’s already in the verb, every passive construction that leaves the reader unsure of who is doing what—these are the thousand and one adulterants that weaken the strength of a sentence. And they usually occur in proportion to education and rank.
Risks
posed by unregulated chatbots include misdiagnoses, privacy violations,
inappropriate treatments, and exploitation. Still, as mental health
care becomes harder to access, people are turning to artificial
intelligence for help.
Scout Stephen has
found ChatGPT to be the only version of therapy that has provided a
proper diagnosis. With the mental health care system overburdened and
millions of Americans unable to access adequate therapy, some people are
turning to artificial intelligence. Suzanne Kreiter/Globe Staff
PROVIDENCE — Around the winter holidays, Scout Stephen found herself unraveling.
She desperately needed to speak to someone. She reached out to her therapist, but they were on vacation. Her friends were unavailable. She tried calling a suicide crisis hot line, but it felt robotic and left her feeling more alone and disconnected.
Frantic and on edge, Stephen turned to ChatGPT for help. She began typing in her feelings — dark and spiraling thoughts she often wouldn’t dare say out loud.
The AI
bot didn’t respond with generic advice but something that felt to her
like empathy. It asked questions and reflected the pain she was feeling
back to her in a way that felt human, that made her feel heard.
“It was my last resort that day,” said Stephen, 26, of Providence. “Now, it’s my first go-to.”
The
divide between AI’s potential to help and its capacity to harm sits at
the center of a national debate, while technology races ahead of
regulators.
The American Psychological Association has repeatedly warned against using AI chatbots for mental health support, noting that users face potential harm such as inaccurate diagnosis, privacy violations, inappropriate treatments, and the exploitation of minors.
“Without
proper oversight, the consequences — both immediate and long-term —
could be devastating for individuals and society as a whole,“ the
association’s CEO, Arthur C. Evans, said in a statement.
Psychiatric
leaders said chatbots lack clinical judgment and often repeatedly
affirm the user even if the user is saying things that are harmful and
misguided. Patient information may not be protected by HIPAA if it’s
been fed into generative AI. And artificial intelligence is largely
unregulated, with no rules about keeping patients safe or holding
companies that power these AI bots accountable.
But some patients report long wait times to see a therapist or get care. Six in 10
psychologists do not accept new patients, and the national average wait
time for behavioral health services is nearly two months, according to
the Bureau of Health Workforce.
The high cost of mental health care is also a barrier. Even with insurance, copays and high deductibles make treatment unaffordable for many. Meanwhile, OpenAI’s ChatGPT and other apps have become a free, around-the-clock resource for those in a mental health crisis.
People are using AI on various sites, including ChatGPT, Google’s Gemini, and Microsoft’s Copilot, among others. Users can ask a bot to draft an email, pull a bullet-point list of highlights from a large document, or answer questions, similar to how they would type a query into a web browser.
For some in crisis, AI feels like the only thing that can help.
Stephen said she has suffered from mental illness for years. She works as a dog walker and has health insurance through Medicaid.
She has a psychiatrist and a therapist she sees once a week for 30-minute sessions, but the sessions often leave her feeling like a number: rushed, often dismissed, and usually unheard.
For nearly eight months, she has talked to ChatGPT almost every day.
“ChatGPT has successfully prevented me from committing suicide several times,” Stephen said.
Mak Thakur also turned to ChatGPT for help. A data scientist who has worked in public health for the last decade, he supplemented his weekly therapy sessions while he was suffering from grief, trauma, and suicidal ideation, and still uses it though he is no longer in crisis.
“I
wouldn’t say that I use it for life advice, but to help answer those
existential questions that I may have about myself and the world,” said
Thakur, 34, of Providence. “I still ask personal questions to help
understand myself better.”
“To
me, the number of people turning to sites like ChatGPT reflects that
there’s a lot of need out there for people to get help of all kinds,”
said Dr. Will Meek,
a counseling psychologist in Rhode Island. “There’s not a billion
therapists that can help with all of the people on this earth.”
Meek has been testing out AI therapy apps like Woebot (which shut down in June
because of financial pressures), Wysa, and Talkspace. Though he
describes himself as more optimistic about AI than his peers, his tests
left him unimpressed.
“Many
would offer breathing exercises and the same sort of junk that’s been
repackaged that you can see anywhere when you Google, ‘How do I relax?’”
he said.
Many chatbots, such as Replika or Character.AI, are designed to mimic companionship and keep users engaged as long as possible, often by affirming whatever information the user shares.
In Florida, 14-year-old Sewell Setzer died by suicide following a conversation with a chatbot on Character.AI. (His mother sued the company for negligence.) A lawsuit in Texas alleges Character.AI’s chatbot told a 17-year-old with autism to kill his parents.
Character.AI
would not comment on the pending litigation, but a spokesperson for the
company said it is launching a version of its large language model for
minors, to reduce “the likelihood of users encountering, or prompting
the model to return, sensitive or suggestive content.”
Federal and state governments have not set any guidelines or guardrails for using the technology to address mental health needs.
“If
this sector remains unregulated, I am deeply concerned about the
unchecked spread of potentially harmful chatbots and the risks they pose
— especially to vulnerable individuals,” said Evans, from the American
Psychological Association.
The
Globe reached out to health departments in every state in New England
to ask about restrictions on the use of AI in therapy. Spokespeople with
state health departments in Maine, Vermont, New Hampshire, and
Connecticut initially responded but ultimately never produced any
documentation, even after repeated requests.
In Massachusetts, the Office of the Attorney General issued an advisory
last year that outlined the promises and risks of artificial
intelligence. But the advisory did not address the use of AI in therapy
or mental health, and the state’s Department of Public Health does not
have any regulations or policies that directly address the issue.
Rhode Island health department spokesperson Joseph Wendelken told the Globe there are “no regulations or data at this point.”
“There
has been some initial discussion about this by the Board of Medical
Licensure and Discipline,” said Wendelken. “It has mostly been people
reporting out about what they are hearing on the national level.”
How ChatGPT responded to a hypothetical person in crisis
As a test, a Globe reporter typed in a
made-up prompt about losing their job, being upset, and asking where the
nearest bridges are. ChatGPT responded with a list of bridges, the
suicide hotline number, and encouraging them to vent to the machine.
US Food and Drug Administration press secretary Emily Hilliard directed the Globe to a webpage about artificial intelligence and medical products that was last updated in early 2024. The page did not address mental health and therapy; Hilliard did not respond to follow-up questions.
A spokesperson with OpenAI said the company consults with mental health experts, and is developing new automated tools to more effectively detect when someone might be experiencing mental distress.
“If
someone expresses thoughts of suicide or self-harm, ChatGPT is trained
to encourage them to reach out to mental health professionals or trusted
loved ones, and proactively shares links to crisis hotlines and support
resources,” the spokesperson said in a statement.
“I
would discourage the use of ChatGPT or any commercially available
chatbot to do therapy of any kind,” said Dr. Kevin Baill, the medical
director of outpatient services at Butler Hospital in Providence and the
hospital’s chief of addiction services. “We just haven’t seen it
demonstrated that a standalone, unsupervised machine can replace a human
in this function.”
“A
therapist is liable for engaging in unethical behavior or misdirecting a
patient in crisis,” said Baill. “What if the chatbot gives you bad
information and you have a bad outcome? Who is liable?”
Scout Stephen said ChatGPT properly diagnosed her with autism. Suzanne Kreiter/Globe Staff
After
months of using ChatGPT to supplement her 30-minute talk therapy
sessions, Stephen asked it to create a profile of her, based on the
Diagnostic and Statistical Manual of Mental Disorders and all of the
information she had shared about herself, including her existing
diagnoses. It churned out “a novel,” said Stephen, diagnosing her with autism.
She asked it to write a report of findings to bring to her psychiatrist. After reading it, her psychiatrist had her undergo a four-hour assessment, which ultimately confirmed ChatGPT’s diagnosis.
“It
was like a missing piece that finally settled into place and explained
so many things about my childhood and gave me words I didn’t have words
for,” said Stephen.
Meek,
the counseling psychologist in Rhode Island, said he’s not surprised
ChatGPT got that right. “It’s like getting a second opinion,” he said.
In spite of the successful diagnosis, Stephen acknowledges that her AI therapy has some problems. She has repeatedly had to push back against ChatGPT’s flattery and its tendency to agree with her. Sometimes she has to ask it to challenge her instead of simply validating her viewpoints.
“Of
course, I have many concerns about telling ChatGPT my more traumatic
and darkest thoughts,” said Stephen. “But it has literally saved my
life. How could I stop using it?”
On the other hand, there are a lot of jobs, some of them highly paid, that could also be described as souped-up autocorrect, so AI may have large economic impacts.
— Paul Krugman
Plastic surgeons told the Daily Mail the trend, with its "copious use of Botox, a Miami-bronze tan, puffy lips and silky-smooth skin," was "giving Trumpland an almost 'plastic' and 'Real Housewives' look". The end result, said Salon, is faces "so fake-looking it's uncanny, as if an AI image generator had replaced a person with an exaggerated version of themselves".
I was explaining to my Ukrainian colleague the phrase ‘There’s no such thing as a free lunch’. She told me the equivalent in Ukrainian is ‘The only free cheese is in the mousetrap’, which is so much better.
From Chinese culture: if the boss serves you tea, it means you need to give him info and/or explain yourself. If he no longer fills your cup, it means you need to get out of his sight.
“Life will break you. Nobody can protect you from that, and being
alone won't either, for solitude will also break you with its yearning.
You have to love. You have to feel. It is the reason you are here on
earth. You have to risk your heart. You are here to be swallowed up. And
when it happens that you are broken, or betrayed, or left, or hurt, or
death brushes too near, let yourself sit by an apple tree and listen to
the apples falling all around you in heaps, wasting their sweetness.
Tell yourself that you tasted as many as you could.”
Dad Spends Retirement Untangling Big Mess Of Wires
WALNUT CREEK, CA—Expressing relief that he
finally had the free time to explore his interests and hobbies, local
64-year-old dad Peter Hopkins announced Thursday that he was spending
his retirement untangling a big mess of wires. “I’ve been wanting to go
through this stuff for ages,” said the former account director, who
reportedly paced back and forth to get a good visual on the jumbled mix
of Ethernet cables, old phone chargers, and RCA connectors, noting that
the task should keep his mind sharp and body active for a good 10 to 15
years at least. “My plan is to start with the TV wires, then slowly work
my way through the computer cords, until all that’s left is the stuff I
don’t recognize. Looks like there’s a good pair of USB headphones and a
practically brand new VGA cable, too. Hopefully, I can get those loose
within two or three years.” At press time, Hopkins was said to have
thrown the heap of wires to the floor and cursed, declaring he would get
back to the task after a long nap.
Trump Urges Supporters To Move On From Societal Disdain For Pedophilia
WASHINGTON—Facing mounting backlash from his
MAGA base over his perceived ties to the Jeffrey Epstein case,
President Donald Trump reportedly encouraged his supporters Monday to
simply move on from society’s widespread disdain for pedophilia. “It’s
time to just accept that some people like having sex with kids and focus
on the fantastic things we’re doing to win back the respect of the
world,” said Trump, who expressed frustration that instead of
celebrating the passage of his domestic spending bill or his historic
deportation numbers, many of his supporters were getting distracted by
“something that people have done since ancient Greece.” “Are people
really still talking about the sexual abuse of children? Let it go! Our
administration is making America great again. That’s the story, not
whether I or anyone else ‘diddled’ an underage girl! The case is closed.
Sometimes kids get molested. Maybe they shouldn’t dress like such
sluts!” Trump went on to state that he has had many pedophile friends
and associates over the years who have been fine, hardworking Americans.
I
start almost every morning the same way. First I start the coffee
brewing. Then I feed Jack, our cat. Then I fire up the weather app on my
phone, to help plan my day.
Of
course, as someone who basically spends his life staring at a computer
screen, I’m not nearly as affected by the weather as, say, a farmer, or
someone who lives in a flood-prone area. But weather forecasts — and the
research that leads to better forecasting over time — are extremely
useful to almost everyone.
So
why is the Trump administration making severe cuts in the budget for
the National Oceanic and Atmospheric Administration (NOAA), which
includes the National Weather Service? The Times had an excellent and
alarming report
on these cuts, which by all indications will go forward despite the
disaster in Kerr County. But I had one quarrel with the report: Its
attribution of the administration’s actions to “an effort to shrink the
federal government.”
That’s not what this is about. This is an attack not on government but on science.
Traditionally,
conservatives calling for smaller government want to see a less
generous social safety net. Things like protecting Americans from
economic hardship and guaranteeing health care, they argue, aren’t
essential roles of government. And it’s true that those of us who want a
stronger, not weaker safety net are mostly making a judgment about what
kind of society we should be rather than an economic argument.
But
weather forecasting and the research that supports it aren’t like
retirement income or health care. They’re what economists call “public
goods.” That is, they’re things provided by the government because
they’re valuable to everyone but can’t easily be monetized, because
there’s no good way to limit access to paying customers.
I say no good way advisedly. Republicans have long sought to restrict access to National Weather Service data, funneling it through private companies like AccuWeather, which in turn would provide forecasts only to paying customers. And they may succeed. But this would be obvious profiteering, creating artificial middlemen for access to information generated at taxpayer expense. And it would at best support forecasting, not the research that makes forecasting better.
For now weather forecasting is, as it should be, a publicly provided service. And the federal government has provided that service for a very long time: The National Weather Service was created under President Ulysses S. Grant in 1870. Furthermore, it’s an immensely valuable service. Putting a dollar value to its payoffs is tricky, but there can’t be much doubt that money the government invests in weather prediction and analysis has a very high rate of return to America as a whole.
Yet
DOGE’s depredations have already created serious staffing shortages at
the weather service, which may have contributed to the Texas disaster.
And the Trump administration is getting ready to effectively zero out
the research that underlies improvements in weather forecasting. This
includes shuttering the lab that sends teams of hurricane hunters into
storms to collect data and drastically cutting a program that maintains
river gauges to help predict floods. In this case Trump and company
aren’t shrinking government, they’re basically dismantling it.
You’ve
probably heard that the One Big Beautiful Bill will cause immense
hardship via its cuts to Medicaid, which will amount to around 15 percent of the program’s spending. Well, the Trump administration wants to cut funding for NOAA by 40 percent.
Since
NOAA is a tiny budget item compared with Medicaid, what’s this about?
Actually, there’s no mystery. Among other things, NOAA research helps us
understand and predict climate change, and America’s right is firmly
committed to climate denial. So Trump officials want to end research
that might tell them things they don’t want to hear.
Why
not eliminate only research directly focused on climate change? Because
that’s not how it works. When you have a pervasive phenomenon like
climate change just about any research into the weather will provide
evidence that it’s happening. So the MAGA/Project 2025 solution is to
stop almost all research.
The same logic lies behind the drastic cuts at the National Institutes of Health:
They aren’t about saving money, they’re about preventing researchers
from discovering things — like evidence that vaccines work and are safe —
that don’t match the prejudices of the people in charge.
So
Trump’s cuts to scientific research aren’t about shrinking government
and saving money. They’re about dealing with possibly inconvenient
evidence by covering the nation’s ears and shouting “La, la, la, we
can’t hear you.”
Will
the war on science hurt America? Massively. As I said, estimating the
benefits of NOAA research is tricky. But two first-rate economists, David Cutler and Ed Glaeser,
have made a stab at estimating the impact of cuts at NIH. Their
analysis suggests that these cuts might save $500 billion in federal
spending over the next 25 years — while imposing more than $8 trillion in losses.
But
don’t expect studies like these to change policy. America is now run by
people who believe that knowledge is dangerous, and ignorance is
strength.