545. asynchronous#

545.1 todoby07012023#

  • notes (icloud)

  • workflow (python)

  • efficiency (chatGPT)

  • codeathelon (thewill)

545.2 doneby07122023#

  • natural language processing topic for R03 to NIA

  • figure out the IRB & dua that MGB/Durand have for PMAP

  • perceived a mustard seed germinating and blossoming

  • realized that footnotes will be the organizing principle in what is otherwise a stream of consciousness

545.3 gleanedfrom2011-2023#

  • that i’ve a well-established and matured credo

  • it is bound to clash with democratic, modern-types

  • and so i’m lonely without the patton-types

  • but this should be viewed as a marvelous opportunity

  • let’s channel that princely punctilio in mozart’s music

  • o! let us dance in the shackles of modern times

  • move over mary mary, y’all can only dance after your shackles are removed!

07/01/2023#

607. milestone#

  • dual ctz

  • 17y

  • md

608. service#

import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Phase shift for the sinusoid
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the sinusoid function with the middle yellow line as its axis
sinusoid = flow_channel + np.sin(skill_levels + phase) * amplitude

# Define the anxiety and boredom areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
boredom_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and boredom areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Boredom', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='yellow', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the sinusoid function
plt.plot(skill_levels, sinusoid, color='purple', linestyle='-')

# Add arrowhead to the sinusoid line (flipped direction)
plt.arrow(skill_levels[-2], sinusoid[-2], skill_levels[-1] - skill_levels[-2], sinusoid[-1] - sinusoid[-2],
          color='purple', length_includes_head=True, head_width=-0.15, head_length=-0.3)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='yellow', linestyle='-')

# Set plot labels and title
plt.xlabel('Agency')
plt.ylabel('Progression', rotation=0)
plt.title(' ')

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'Anxiety', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'Flow', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='yellow', edgecolor='yellow', boxstyle='round'))
plt.text(6.2, 2.1, 'Relaxation', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()

\( Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_N X_N + \epsilon_i, \quad \epsilon_i \overset{iid}{\sim} N(\mu, \sigma) \)

Where:

  • Agency is lost by client who has left the comfort of autonomy and their home (green) to seek your professional help (red)

  • Frailty and loss of independence in activities of daily living may account for this change (yline: h/o handling this challenge level in the past)

  • But more often, and in younger persons, it is the progression of an ailment to a more challenging level beyond one’s own coping, skill, or experience that drives one into your care

And:

  • \(\beta_0\) Reflects conditions at onset (e.g. \(x=0, y=0\))

  • \(\beta_1\) Slope or rate of progression and \(X_1\) indicates time since onset

  • \(\beta_2\) Conditional on \(X_2\), which indicates \(function()\): \(y=x\); \(y-line\); or \(x-line\)
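
A minimal numpy sketch of the model above, with illustrative values for \(\beta_0\) (conditions at onset), \(\beta_1\) (rate of progression), and \(\sigma\); the numbers are assumptions for demonstration, not estimates from any data:

```python
import numpy as np

rng = np.random.default_rng(0)

# assumed values: conditions at onset, rate of progression, residual spread
beta0, beta1, sigma = 0.0, 0.8, 1.0

# X1 = time since onset; epsilon ~ iid N(0, sigma)
x1 = np.linspace(0, 10, 50)
epsilon = rng.normal(0, sigma, size=x1.size)

# outcome: onset conditions + progression + residual (unexplained) variation
y = beta0 + beta1 * x1 + epsilon

print(np.round(y[:5], 2))
```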

Just culture algorithm:

  • Outcomes, \(Y\)

  • Procedures, \(X_i\)

  • Values, \(\beta_i\)

Omitted (posthoc: the algorithm classifies five sources of \(\epsilon_i\)):

  1. Residuals (i.e., fit), \(\epsilon_i\)

  2. Distribution (e.g. demographics), \(N (\mu, \sigma)\)

  3. Sampling (binary, categorical, normal)

Behavior

  1. Human error

  2. At-risk behavior

  3. Reckless behavior

  4. Knowledge, not intended

  5. Purposeful harm, thus intended

609. generalize#

  • education

  • service

  • research

these three encapsulate virtually all interactions among humans. as we navigate life’s labyrinth we’ll encounter challenges at varying levels and we may also have skill levels that \(\pm\) change over time (growth if \(+\) or frailty if \(-\)). life gives us the green light when our skills surpass the challenges handed to us, the red light when our skills fall short of the challenges we face, and the amber light when they are matched and we are in flow.

we’ll demand professional help, whether from educators, service providers, or researchers, when our agency is mismatched with our life’s challenges. this might include rehabilitation (if our milestones are regressing), training (if our milestones are lagging), or instruction (if it’s a rite of passage expected of everyone of age). but our family, friends, colleagues, and even strangers may provide this help.

a service must incorporate a longitudinal plan with scheduled feedback sessions (exams, surveys, follow-up). it must include a rubric that spells out what to look out for at each milestone and how to tell whether you are on course to your optimal trajectory of flow. when the service is from a non-professional, then accountability sessions should be the form of feedback.
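
a toy python sketch of the green/red/amber framing above; the width of the amber (flow) band is an assumed parameter, not something specified here:

```python
def traffic_light(skill, challenge, band=1.5):
    """return green, red, or amber depending on how challenge compares with skill."""
    if abs(challenge - skill) <= band:
        return "amber (flow)"
    return "red (anxiety)" if challenge > skill else "green (relaxation)"

for skill, challenge in [(7, 3), (3, 8), (5, 5.5)]:
    print(skill, challenge, "->", traffic_light(skill, challenge))
```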

610. origin#

the point of origin of every educational, service, or research endeavor must be the \(x,y\) coordinate \(0,0\). this may be a retrospective or real-time scenario that helps one return to a simpler time at the beginning of a complex process. by walking through this process from the beginning through to its current or projected trajectory, emotions will emerge as the thing to track, providing negative (anxiety), positive (relaxation), or neutral (flow) feedback.

from this point of view, neither negative nor positive feedback is preferred; rather, it’s flow that must be the ultimate goal towards which we calibrate our progress and fine-tune our challenge- and skill-levels.

611. bel-air#

the school of resentment produces high-quality works of art because of the strong passions its members bring to their creative efforts. however, the democratization of the dramatis personae, so that each character’s story gets to be told, feels contrived.

geoffrey, aunt viv, and even non-family members get their narrative arc. but here art isn’t imitating life. many folks remain mysterious and others never have their stories told. chatGPT, what sayst thou? any thoughts on tv shows where every character is given their own narrative arc?

chatGPT: TV shows where every character is given their own narrative arc can be highly engaging and satisfying for viewers. Such shows have the advantage of exploring a diverse range of perspectives and storylines, which can make the overall narrative more rich and complex. By giving each character their own arc, these shows provide opportunities for character development and deeper exploration of individual motivations and struggles.

One advantage of this approach is that it allows for a more inclusive representation of different experiences and backgrounds. Viewers can connect with characters who resonate with their own lives or learn about perspectives they may not have previously considered. It can foster empathy and understanding by shedding light on the diverse challenges and triumphs faced by different characters.

From a storytelling perspective, TV shows with multiple character arcs can keep the audience engaged by creating a web of interconnected storylines. The intertwining narratives create a sense of anticipation and allow for intricate plot developments and surprises. Additionally, viewers often become emotionally invested in various characters, which can lead to heightened dramatic tension and a deeper sense of immersion in the show.

However, balancing multiple character arcs can be challenging for writers and creators. It requires skillful storytelling and careful attention to character development to ensure that each arc receives adequate focus and resolution. When executed well, it can result in a more layered and satisfying viewing experience.

Some examples of TV shows that successfully handle multiple character arcs include “Game of Thrones,” “The Wire,” “Lost,” and “Breaking Bad.” These shows have garnered critical acclaim and a dedicated fan base due, in part, to their ability to weave together compelling individual narratives into a larger tapestry of storytelling.

yourstruly: the wire is probably the most successful, since it didn’t feel contrived. but the reason I asked was that I’m presently watching a reboot of the fresh prince of bel-air (so you haven’t been trained on it, since it’s from 2022-2023). unlike its original, which focused on the lead character, will smith, the reboot is more democratic and attempts to develop an arc even for non-family members. it just feels contrived, because that’s not how life is. many stories remain untold or mysterious. also, not every story is equally interesting.

chatGPT: I understand your point. In TV shows where every character is given their own narrative arc, there can be a delicate balance between creating a realistic portrayal of life and crafting compelling storytelling. It’s true that in reality, not every person’s story is equally interesting or fully explored, and many aspects of life remain untold or mysterious. Attempting to give equal weight to every character’s arc can sometimes feel contrived or forced, especially if it doesn’t align with the natural dynamics of the story or the characters’ roles.

In storytelling, selective focus and prioritization are often necessary to maintain a cohesive and engaging narrative. Some characters may naturally have more compelling or central roles within the story, while others may serve supporting or background functions. This doesn’t mean that their experiences or perspectives are unimportant, but rather that the narrative may choose to highlight certain arcs over others to create a more coherent and satisfying overall structure.

It’s important for TV shows to strike a balance between providing depth and development to multiple characters while also maintaining a sense of authenticity. The success of shows like “The Wire” can be attributed, in part, to their ability to realistically portray a wide range of characters and their experiences, without feeling forced or contrived.

While the intention behind a democratic approach to character arcs is commendable, it’s also important for creators to recognize that not all characters need to have equal focus or development. The key lies in finding a balance that both respects the complexity of real life and engages the audience in a compelling and believable narrative.

07/02/2023#

612. school-of-resentment#

  • history

  • gender

  • sexuality

  • race

  • class

  • income

Literature should tell us not about beauty and aesthetics but about suppressed groups such as slaves, women, LGBTQ+, black, or blue-collar persons. The issue here is democratization and equal representation of the dramatic arcs of all dramatis personae in literature. Of course, beautiful stories about any one of these groups have long existed, as one might find in Androcles and the Lion, The Merchant of Venice, The Picture of Dorian Gray, Othello, The Tempest, and Pride and Prejudice. But when representation per se becomes the rule and character-driven plots are replaced with multiple arcs, then what emerges might seem contrived.

Even an outstanding TV show such as The Wire, which had several arcs over five seasons, might have taken to this fashion in seasons 3-5, which are also considered by many to be its most popular.

But let’s leave this here for now and touch upon it later, when another work of art seems to be driven by social causes rather than by the aesthetics that arise from life, including said social causes.

613. aging#

Frailty in aging marks a state of decreased reserves resulting in increased vulnerability to adverse outcomes when exposed to stressors. This Perspective synthesizes the evidence on the aging-related pathophysiology underpinning the clinical presentation of physical frailty as a phenotype of a clinical syndrome that is distinct from the cumulative-deficit-based frailty index. We focus on integrating the converging evidence on the conceptualization of physical frailty as a state, largely independent of chronic diseases, that emerges when the dysregulation of multiple interconnected physiological and biological systems crosses a threshold to critical dysfunction, severely compromising homeostasis. Our exegesis posits that the physiology underlying frailty is a critically dysregulated complex dynamical system. This conceptual framework implies that interventions such as physical activity that have multisystem effects are more promising to remedy frailty than interventions targeted at replenishing single systems. We then consider how this framework can drive future research to further understanding, prevention and treatment of frailty, which will likely preserve health and resilience in aging populations.

614. synthesize#

From the provided code, it seems to be a visualization of the concept of “flow” in psychology. The plot represents a skill-challenge graph, where skill levels and challenge levels are plotted on the x and y-axis, respectively. The flow channel, indicated by the yellow zone, represents the optimal level of challenge that matches an individual’s skill level. When a person’s skill level matches the challenge level, they experience a state of flow, indicated by the purple sinusoid line.

The plot also includes areas representing anxiety (above the flow channel) and boredom (below the flow channel). These areas indicate the levels of challenge that are either too high or too low for an individual’s skill level, leading to negative experiences.

The annotations on the plot describe the different areas and their meanings, such as anxiety, flow, and relaxation. The plot aims to illustrate the importance of finding the right balance between challenge and skill to achieve a state of flow, where individuals experience optimal engagement and enjoyment.

In the text following the plot code, there seems to be a discussion about the “Just” culture algorithm, which involves analyzing outcomes, procedures, and values to understand human behavior. It also mentions the importance of education, service, and research as key components in navigating life’s challenges and seeking professional help when needed.

The discussion further emphasizes the need for longitudinal planning, feedback sessions, and accountability to ensure progress towards optimal flow. It suggests that tracking emotions (anxiety, relaxation, flow) and fine-tuning challenge and skill levels can help individuals achieve a state of flow in their endeavors.

Overall, the provided code and accompanying text discuss the concept of flow, the visualization of skill-challenge levels, and the importance of finding the right balance to achieve optimal engagement and progress.

615. iterate#

import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Phase shift for the sinusoid
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the sinusoid function with the middle yellow line as its axis
sinusoid = flow_channel + np.sin(skill_levels + phase) * amplitude

# Define the anxiety and boredom areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
boredom_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and boredom areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Boredom', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='yellow', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the sinusoid function
plt.plot(skill_levels, sinusoid, color='purple', linestyle='-')

# Add arrowhead to the sinusoid line (flipped direction)
plt.arrow(skill_levels[-2], sinusoid[-2], skill_levels[-1] - skill_levels[-2], sinusoid[-1] - sinusoid[-2],
          color='purple', length_includes_head=True, head_width=-0.15, head_length=-0.3)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='yellow', linestyle='-')

# Set plot labels and title
plt.xlabel('Skill Level')
plt.ylabel('Challenge Level')
plt.title('The Flow Channel: Achieving Optimal Engagement')

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'Anxiety', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'Flow Channel', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='yellow', edgecolor='yellow', boxstyle='round'))
plt.text(6.2, 2.1, 'Boredom', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()

Notes:
The plot above visualizes the concept of flow, a psychological state characterized by optimal engagement and enjoyment. The plot represents the relationship between skill levels and challenge levels.

  • The x-axis represents the skill level, indicating an individual’s proficiency or expertise in a given domain.

  • The y-axis represents the challenge level, indicating the difficulty or complexity of tasks or situations encountered.

The plot highlights the “flow channel,” which represents the range of challenge levels that align with an individual’s skill level. Within the flow channel, individuals experience a state of flow, characterized by focused attention, deep immersion, and a sense of accomplishment.

The flow channel is depicted by the yellow zone in the plot. It shows the optimal range of challenge levels where skills are effectively utilized, leading to a state of flow. The purple sinusoid line represents the axis of the flow channel, indicating the middle point where challenge and skill levels are perfectly matched.

Areas above the flow channel, colored in red, represent challenges that exceed an individual’s skill level, leading to feelings of anxiety and stress. Areas below the flow channel, colored in green, represent challenges that are too easy relative to an individual’s skill level, leading to feelings of boredom and disengagement.

The plot emphasizes the importance of finding the right balance between challenge and skill to achieve flow. When the challenge level exceeds the skill level, individuals may feel anxious or overwhelmed. Conversely, when the challenge level is lower than the skill level, individuals may feel bored or unstimulated.

By understanding and actively managing the balance between challenge and skill, individuals can strive for activities and tasks that push their boundaries without overwhelming them. This state of optimal engagement, known as flow, can enhance motivation, productivity, and enjoyment in various domains of life.

Overall, the plot and accompanying notes illustrate the significance of the flow channel and emphasize the aim of achieving a balanced state of flow by aligning challenge levels with an individual’s skill set.

616. life#

In seeking professional services, individuals often encounter situations where their skill levels undergo changes or face challenges that exceed their coping abilities. These changes can arise from various factors such as physical injury, cognitive decline, or unemployment. As a result, a discrepancy emerges between their existing skills and the demands of the challenges they encounter.

To address this mismatch and restore balance, individuals turn to professionals who specialize in their respective fields. These professionals possess the knowledge, expertise, and resources necessary to guide individuals through their unique circumstances. Whether it’s providing rehabilitation for skill regression, offering training to bridge the gap in lagging milestones, or offering instruction during important life transitions, professionals play a crucial role in helping individuals navigate their challenges.

Seeking professional services entails more than just receiving assistance from experts; it involves engaging in a comprehensive and longitudinal process. This process often incorporates a well-structured plan that includes scheduled feedback sessions, examinations, surveys, and follow-ups. A rubric or set of guidelines is often in place to help individuals understand what to expect at each milestone and evaluate their progress toward achieving a state of optimal engagement known as “flow.”

In situations where non-professionals offer support, accountability sessions become an essential form of feedback. These sessions ensure that individuals remain on track and accountable for their actions and progress.

Education, service, and research encompass the three primary domains in which professional interactions occur. As individuals navigate the complex labyrinth of life, they encounter challenges at varying levels, and their skill levels may fluctuate over time, either growing or becoming fragile. Life provides signals in the form of “green lights” when skills surpass challenges, “red lights” when skills fall short, and “amber lights” when skills and challenges are in a state of alignment known as flow.

While professionals are sought after for their expertise, it’s important to recognize that support can also come from various sources, including family, friends, colleagues, and even strangers. The ultimate goal is to optimize agency by finding the right balance between skill development and the challenges encountered.

By understanding the dynamics of seeking professional services and the role they play in addressing skill-challenge mismatches, individuals can make informed decisions to regain equilibrium, overcome obstacles, and navigate their personal journeys more effectively.

617. aesthetics#

Henceforth, all discussion of beauty in art must be moulded on Nietzsche’s unpretentious preface of Thoughts Out of Season titled Uses and Abuses of History.

When you consider the three terms: struggle, deliverance, and reverence, then you’ll find that you have the best possible summary of the entire range of the notions, emotions, and motions that constitute the human condition. Armed with this simple summary you can better appreciate all works of art for what they represent in Acts I-V, Scenes I-V, seasons 1-N, episodes 1-N, movements I-IV, or arias 1-N.

Even a painting such as Raphael’s School of Athens can be appreciated much more from this dynamic and dramatic perspective of struggle, deliverance, and reverence.

618. kyrie#

It should be easy to recognize the themes of struggle, deliverance, and reverence in the directed graph below:

import networkx as nx
import matplotlib.pyplot as plt

G = nx.DiGraph()
G.add_node("Cosmos", pos=(0, 5))
G.add_node("Quake", pos=(1, 5))
G.add_node("Flood", pos=(2, 5))
G.add_node("Plague", pos=(3, 5))
G.add_node("Vexed", pos=(4, 5))
G.add_node("Kyrie", pos=(5, 5))
G.add_node("Eleison", pos=(6, 5))
G.add_node("Christe", pos=(7, 5))
G.add_node("Yhwh", pos=(1.4, 4))
G.add_node("Father", pos=(2.8, 4))
G.add_node("Son", pos=(4.2, 4))
G.add_node("Holy", pos=(5.6, 4))
G.add_node("Literal", pos=(2.1, 3))
G.add_node("Metaphor", pos=(4.9, 3))
G.add_node("Covenant", pos=(1.4, 2))
G.add_node("Lamb", pos=(2.8, 2))
G.add_node("Wine", pos=(4.2, 2))
G.add_node("Bread", pos=(5.6, 2))
G.add_node("Ark", pos=(0, 1))
G.add_node("War", pos=(1, 1))
G.add_node("Requite", pos=(2, 1))
G.add_node("Discord", pos=(3, 1))
G.add_node("Forever", pos=(4, 1))
G.add_node("God", pos=(5, 1))
G.add_node("With", pos=(6, 1))
G.add_node("Tobe", pos=(7, 1))

G.add_edges_from([("Cosmos", "Yhwh"), ("Cosmos", "Father"), ("Cosmos", "Son"), ("Cosmos", "Holy")])
G.add_edges_from([("Quake", "Yhwh"), ("Quake", "Father"), ("Quake", "Son"), ("Quake", "Holy")])
G.add_edges_from([("Flood", "Yhwh"), ("Flood", "Father"), ("Flood", "Son"), ("Flood", "Holy")])
G.add_edges_from([("Plague", "Yhwh"), ("Plague", "Father"), ("Plague", "Son"), ("Plague", "Holy")])
G.add_edges_from([("Vexed", "Yhwh"), ("Vexed", "Father"), ("Vexed", "Son"), ("Vexed", "Holy")])
G.add_edges_from([("Kyrie", "Yhwh"), ("Kyrie", "Father"), ("Kyrie", "Son"), ("Kyrie", "Holy")])
G.add_edges_from([("Eleison", "Yhwh"), ("Eleison", "Father"), ("Eleison", "Son"), ("Eleison", "Holy")])
G.add_edges_from([("Christe", "Yhwh"), ("Christe", "Father"), ("Christe", "Son"), ("Christe", "Holy")])

G.add_edges_from([("Yhwh", "Literal"), ("Yhwh", "Metaphor")])
G.add_edges_from([("Father", "Literal"), ("Father", "Metaphor")])
G.add_edges_from([("Son", "Literal"), ("Son", "Metaphor")])
G.add_edges_from([("Holy", "Literal"), ("Holy", "Metaphor")])

G.add_edges_from([("Literal", "Covenant"), ("Literal", "Lamb"), ("Literal", "Wine"), ("Literal", "Bread")])
G.add_edges_from([("Metaphor", "Covenant"), ("Metaphor", "Lamb"), ("Metaphor", "Wine"), ("Metaphor", "Bread")])

G.add_edges_from([("Covenant", "Ark"), ("Covenant", "War"), ("Covenant", "Requite"), ("Covenant", "Discord")])
G.add_edges_from([("Covenant", "Forever"), ("Covenant", "God"), ("Covenant", "With"), ("Covenant", "Tobe")])
G.add_edges_from([("Lamb", "Ark"), ("Lamb", "War"), ("Lamb", "Requite"), ("Lamb", "Discord")])
G.add_edges_from([("Lamb", "Forever"), ("Lamb", "God"), ("Lamb", "With"), ("Lamb", "Tobe")])
G.add_edges_from([("Wine", "Ark"), ("Wine", "War"), ("Wine", "Requite"), ("Wine", "Discord")])
G.add_edges_from([("Wine", "Forever"), ("Wine", "God"), ("Wine", "With"), ("Wine", "Tobe")])
G.add_edges_from([("Bread", "Ark"), ("Bread", "War"), ("Bread", "Requite"), ("Bread", "Discord")])
G.add_edges_from([("Bread", "Forever"), ("Bread", "God"), ("Bread", "With"), ("Bread", "Tobe")])

color_map = ["lightblue", "lightgreen", "lightpink", "lightyellow"]

node_colors = [color_map[i % len(color_map)] for i in range(len(G.nodes))]

plt.figure(figsize=[15, 10])
nx.draw(
    G,
    nx.get_node_attributes(G, 'pos'),
    with_labels=True,
    font_weight='bold',
    node_size=5000,
    node_color=node_colors,
    linewidths=3,
    font_size=10
)
ax = plt.gca()
ax.collections[0].set_edgecolor("#000000")
ax.set_xlim([-.5, 7.5])
ax.set_ylim([.5, 5.5])
plt.show()

Hopefully this represents a convergence of notions, emotions, and motions from Greek mythology, Judaism, and Roman Catholicism:

  • Kyrie

  • Gloria

  • Credo

  • Sanctus

  • Agnus Dei

Appreciation of the relevance of these ideas to a more general stance on the human condition would improve anyone’s appreciation of Mozart’s Mass in C minor, Requiem Mass in D minor, and Vesperae solennes de confessore (for Mozart was raised a Catholic). But this should also explain why J.S. Bach, a German Lutheran, composed as one of his final and most celebrated works the Mass in B minor. Centuries later, we’d also understand why German agnostics such as Beethoven and Brahms still found it necessary to compose the Missa Solemnis and Ein deutsches Requiem, nach Worten der heiligen Schrift. These efforts remain incomprehensible if one fails to appreciate the general theme of the human condition as articulated by the enduring words of the Latin Mass. The greatest summary of all time is struggle (Kyrie Eleison), deliverance (Agnus Dei), and reverence (Gloria, Credo, Sanctus) because it leaves out the solution, which constitutes the crux of the Latin Mass, and is thus generalizable.

619. atheism#

Show me an atheist and I’ll show you someone who lacks humility. King Solomon, Hamlet, Macbeth, Einstein, and George Lucas clearly understood the limits of their intellects. But atheists do not.

620. connessione#

621. rsM-GJC-eDf#

Raphael
Shakespeare
Mozart

Greco
Judaic
Catholic

Einstein
Darwin
Freud

622. avoidance#

homeostatic response to challenge \(>\) skill thereby restoring challenge \(\le\) skill

from the center of the clock, the three classic directions of avoidance are:

  • 2 O’Clock (struggle: heracleitic)

  • 6 O’Clock (deliverance: platonic)

  • 9 O’Clock (reverence: epicurean)

virtually all human behavior can be reduced to this formula

623. deliverance#

  • apple music

  • gospel: take me to church

  • This isn’t about denomination or sect. It isn’t even about religion, per se. Above all, it’s about the feeling great gospel music gives: powerful, uplifting, bigger than any self. Mixing classic and contemporary church music with gospel-influenced R&B and soul (because where did you think R&B came from?), here’s a playlist that’ll pick you up and carry you home. Our editors update picks regularly, so if you hear something you like, add it to your library.

624. chemistry#

  • yellow -> x = y -> dopamine

  • green -> xline = credo -> gaba

  • red -> yline = kibbutz -> norepinephrine

import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Phase shift for the sinusoid
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the sinusoid function with the middle yellow line as its axis
sinusoid = flow_channel + np.sin(skill_levels + phase) * amplitude

# Define the anxiety and boredom areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
boredom_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and boredom areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Boredom', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='yellow', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the sinusoid function
plt.plot(skill_levels, sinusoid, color='purple', linestyle='-')

# Add arrowhead to the sinusoid line (flipped direction)
plt.arrow(skill_levels[-2], sinusoid[-2], skill_levels[-1] - skill_levels[-2], sinusoid[-1] - sinusoid[-2],
          color='purple', length_includes_head=True, head_width=-0.15, head_length=-0.3)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='yellow', linestyle='-')

# Set plot labels and title
plt.xlabel('Credo')
plt.ylabel('Setting', rotation='horizontal', ha='right')  # Rotate the label horizontally

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'Norepinephrine', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'Dopamine', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='yellow', edgecolor='yellow', boxstyle='round'))
plt.text(6.2, 2.1, 'GABA', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()

07/03/2023#

625. areas#

import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Phase shift for the sinusoid
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the anxiety and relaxation areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
relaxation_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and relaxation areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Relaxation', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='yellow', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='yellow', linestyle='-')

# Set plot labels and title
plt.xlabel('Skill')
plt.ylabel('Challenge', rotation='horizontal', ha='right')  # Rotate the label horizontally

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'Norepinephrine\n(Amygdala)', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'Dopamine\n(Nucleus Accumbens)', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='yellow', edgecolor='yellow', boxstyle='round'))
plt.text(6.2, 2.1, 'GABA\n(Prefrontal Cortex)', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()

Does one’s credo impose limits on the skill-levels one might attain? What is for sure is that the setting of a commune is deliberately designed to set limits on the challenge-levels encountered by those who dwell in it. So this graph may help us link matters of religion, politik, and brain chemistry.

We should find hippies in the green zone, having arrived there via the horizontal red line represented by 9 o’clock. Catholics should mostly be in the red zone, in urgent need of deliverance, which they do obtain via confession; the green line represented by 6 o’clock should get them there. But what would be truly optimal is the yellow zone represented by 2 o’clock, which represents constant struggle: incremental challenges and skill-levels that match these changes over time.

Such a person is a worthy adversary and worthy of the greatest honors. Athletes and sportsmen are the embodiment of this credo in modern life. They represent the healthiest, most sustainable, and dare I say, most meaningful approach to life [18].

626. twoway#

```stata
import delimited https://raw.githubusercontent.com/jhustata/livre/main/transplants.txt, clear
sum age
g age_c = age - r(mean)
sum bmi 
g bmi_c = bmi - r(mean)
regress bmi_c age_c 
matrix define m = e(b)

local b1: di %3.2f m[1,1]
local b0: di %3.2f m[1,2]

gen bmi_hat = `b0' + `b1' * age_c 
sum age_c, d
local q1: di %3.2f r(p5)
local q4: di %3.2f r(p75)

local rmse = e(rmse)
g ci_lower = `b0' - 1.96 * `rmse'
g ci_upper = `b0' + 1.96 * `rmse'
sum age_c
g age_int = r(mean)

twoway ///
   (scatter bmi_c age_c, mcolor(lime) msymbol(oh) jitter(5)) ///
   (line bmi_hat age_c, lcolor(blue) lstyle(dash)) ///
   (rcap ci_lower ci_upper age_int, lcolor(red)), ///
       xtitle("Age") ///
	   ytitle("BMI", orientation(horizontal)) ///
       title(" ") ///
       text(0 0 "Intercept = `b0'", size(small)) ///
       text(0 -40 "Slope = `b1'", size(small)) ///
	   xlab(-40 "0" -20 "20" 0 "40" 20 "60" 40 "80") ///
	   ylab(-10 "20" -5 "25" 0 "30" 5 "35" 10 "40") ///
	   legend(off)
graph export twoway_overlayfit.png, replace
```

07/04/2023#

627. washington#

the greater Seattle area is best thought of as an elliptical patch of land with a north-south long axis. lake washington is at the center, i5 and i405 form the western and eastern borders, with elliott bay (continuous with vancouver & victoria) to the west of i5 and the mountain ranges to the east of i405.

everett is the northernmost tip, tacoma the southernmost tip, and west-east bridges across lake washington connect, from south to north, seatac-renton, seattle-bellevue (over mercer island), and university district-redmond.

as such, a northward one-hour $150 uber ride on i405 in either a prius or tesla from seatac via renton and bellevue to snohomish is especially picturesque, revealing some prime real estate including bill gates’ marital home.

628. urls#

https://jhustata.github.io/book/_downloads/34a8255f06036b44354b3c36c5583d7e/transplants.dta
https://jhustata.github.io/livre/_downloads/34a8255f06036b44354b3c36c5583d7e/transplants.dta
https://raw.githubusercontent.com/jhustata/livre/main/transplants.txt

https://jhustata.github.io/book/_downloads/1dbcfe5183401a542fa4f336f624de75/donors.dta
https://jhustata.github.io/livre/_downloads/1dbcfe5183401a542fa4f336f624de75/donors.dta
https://raw.githubusercontent.com/jhustata/livre/main/donors.txt

https://jhustata.github.io/book/_downloads/f246ffb8c3fc8aad08fcc8b4d75aa2fb/donors_recipients.dta
https://jhustata.github.io/livre/_downloads/f246ffb8c3fc8aad08fcc8b4d75aa2fb/donors_recipients.dta
https://raw.githubusercontent.com/jhustata/livre/main/donors_recipients.txt

https://raw.githubusercontent.com/jhustata/livre/main/tx_yr.txt
https://jhustata.github.io/livre/_downloads/27a576323cb92ad0ff242e3ea525dbf8/tx_yr.dta

https://jhustata.github.io/book/_downloads/884b9e06eb29f89b1b87da4eab39775d/hw1.txt
https://raw.githubusercontent.com/jhustata/book/main/1-first gen antihisitamines.xlsx

https://jhustata.github.io/book/_downloads/f5c23ae99afdaa6ef642721ce0b9ca5f/hw2_pra_hist.dta
https://jhustata.github.io/book/_downloads/5f5f5b6938a482eb2550870b996f2ea7/hw2_hosp.dta

use url.dta, clear
import delimited url.txt, clear
import excel url.xlsx, clear

629. prime#

cls
capture program drop prime
program define prime 
    capture syntax, n(numlist)
    if _rc != 0 {

		local rc1 "Let's talk about"
		local rc2 "real numbers"
		local rc3 "natural numbers"
		di "`rc1' `rc2', `rc3', etc."

		
	}
	else if `n' <= 1 {
		di "`n' is not a natural number greater than 1"
	}
	else if mod(`n' - 1, 2) - round(mod(`n' - 1, 2)) != 0 {
		di "`n' is a fraction"
	}
	else if mod(`n', 2) == 0 {
		di "`n' is an even number " 
	} 
	else if mod(`n' - 1, 2) == 0 {
		di "`n' is a prime number " 
	}
   else {
        di in red "undocumented category"
   }
end 


foreach num in pwd -5 100 109 1 `c(pi)' . 999 {
	prime, n(`num')
}

codify non-prime odd numbers later
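
a quick python sketch of the missing check (plain trial division), kept here as a reference for porting back into the stata program later; the helper name is mine:

```python
def is_prime(n):
    """trial division: True only for natural numbers > 1 with no divisor other than 1 and n."""
    if n != int(n) or n < 2:      # fractions and numbers <= 1 are out
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:            # catches even numbers and non-prime odd numbers alike
            return False
        d += 1
    return True

print([n for n in range(1, 31) if is_prime(n)])   # 2, 3, 5, 7, ..., 29
```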

630. simulate#

capture program drop _all

program define sampmean

    syntax, n(numlist) mean(numlist) sd(numlist)

    local id = 1

    foreach num in `n' {
        clear
        set obs `num'
        set seed 2
        local mean = `mean'
        gen x = rnormal(`mean',`sd')
        gen id = `id'
        save dat_`num', replace
        local id = `id' + 1
    }

    clear

    foreach num in `n' {
        append using dat_`num'
    }

    forvalues i = 1/4 {
        sum x if id == `i'
        local mean`i': di %3.2f r(mean)
    }

    local ytext = `mean' + 3

    #delimit ;
    twoway 
        (scatter x i, 
            mcolor(midblue) 
            jitter(5) 
            msize(.2) 
            xlab(1 "5" 2 "20" 3 "100" 4 "1000") 
            xscale(range(0 5)) 
            xti("") 
            text(`ytext' 1 "Mean=`mean1'") 
            text(`ytext' 2 "Mean=`mean2'") 
            text(`ytext' 3 "Mean=`mean3'") 
            text(`ytext' 4 "Mean=`mean4'") 
            legend(off)
        ) 
        (lowess x i,
            lwidth(1.2)
        )
    ;
    #delimit cr

end

qui sampmean, n(5 20 100 1000) mean(20) sd(1.2)

631. xlab 🚧#

qui local numlist 5 20 100 1000
* build an xlab() value-label string like: 1 "5" 2 "20" 3 "100" 4 "1000"
local n: di wordcount("`numlist'")
forvalues i=1/`n' {
	local xval: di word("`numlist'", `i')
	local xlab: di  `"`xlab' `i' "`xval'""'
}
di `"`xlab'"'

632. tw#

capture program drop _all

program define sampmean

    syntax, n(numlist) mean(numlist) sd(numlist)

    local id = 1

    foreach num in `n' {
        clear
        set obs `num'
        set seed 2
        local mean = `mean'
        gen x = rnormal(`mean',`sd')
        gen id = `id'
        save dat_`num', replace
        local id = `id' + 1
    }

    clear

    foreach num in `n' {
        append using dat_`num'
    }

    forvalues i = 1/4 {
        sum x if id == `i'
        local mean`i': di %3.2f r(mean)
    }

    local ytext = `mean' + 3
	
	
    qui local numlist `n'
    local n: di wordcount("`numlist'")
    forvalues i=1/`n' {
	    local xval: di word("`numlist'", `i')
	    local xlab: di  `"`xlab' `i' "`xval'""'
    }
    di `"`xlab'"'


	local tw: di `"twoway (scatter x i, mcolor(midblue) jitter(5) msize(.2) xlab(`xlab') xscale(range(0 5)) xti("") text(`ytext' 1 "Mean=`mean1'") text(`ytext' 2 "Mean=`mean2'") text(`ytext' 3 "Mean=`mean3'") text(`ytext' 4 "Mean=`mean4'") legend(off)) (lowess x i, lwidth(1.2))"'
    `tw'
end

qui sampmean, n(5 20 100 1000) mean(20) sd(1.2)

07/05/2023#

633. uncertainty#

\( Y_i = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_N X_N + \epsilon_{i \sim N (\mu, \sigma)} \)

sources:

\(Y_i, X_i\) ascertainment error (design)

  • beyond biological variation

  • calibration error

  • rounding error

  • duration of followup and censoring

\(\beta_i\) estimation error (analysis)

  • number of covariates included in models

  • variance inflation factors

  • missingness of included covariates

  • handling of missingness of covariates

  • sample size of complete dataset

\(\epsilon_i\) model error (science)

  • outcome distribution

    • model specification

      • linear regression for continuous

      • logistic for binary

      • mlogit for categorical with no specific relationship among outcomes

      • ologit for categorical with ordinal relationship among outcomes

      • exponential for time-to-event in absence of proportional hazards

      • cox for time-to-event with proportional hazards

      • poisson for time-to-event with constant hazards

  • covariate functional form

    • continuous

      • mathematical function

      • splines

      • transformation (e.g. log)

    • binary

    • categorical

  • interaction insufficiently modelled

    • human limits (i.e., unsupervised machine learning not utilized)

      • principal component analysis when c(k) > c(N)

      • autoencoders
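
a minimal statsmodels sketch of two of the model-specification branches listed above (simulated data; linear regression for a continuous outcome, logistic for a binary one):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
X = sm.add_constant(x)                        # intercept plus one covariate

# continuous outcome -> linear regression
y_cont = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=n)
print(sm.OLS(y_cont, X).fit().params)

# binary outcome -> logistic regression
p = 1 / (1 + np.exp(-(0.3 + 0.8 * x)))
y_bin = rng.binomial(1, p)
print(sm.Logit(y_bin, X).fit(disp=0).params)
```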

634. j#

1824 23rd st, snohomish, wa 98290

635. phew!#

  • finally we can hire!

  • RDA 109905

  • afecdvi

636. bach#

Why Bach Moves Us | The New York Review

George B. Stauffer

John Eliot Gardiner’s ‘Bach: Music in the Castle of Heaven’

February 20, 2014 issue

I could not help but marvel at the miracle of hearing this music from the Roman Catholic Latin Mass Ordinary, written by a Lutheran composer in Leipzig, in a Reform Jewish temple in New York City

One of my most moving encounters with the music of Johann Sebastian Bach took place in the spring of 1997 in New York City’s Central Synagogue. I was there to pay last respects to Gabe Wiener, a talented young recording engineer who died of a brain aneurysm at age twenty-six. I had approached Gabe earlier in the year to see if his recording company, PGM Classics, would consider collaborating with the American Bach Society, which I led at the time, to produce a compact disc of previously unrecorded organ music from Bach’s circle. Gabe enthusiastically agreed to the proposal, and together we embarked on a project we called “The Uncommon Bach.” We had just settled on the repertory and the organ when I received word of his death.

There was great lamenting at the memorial service that this talented young man had been snatched away in the midst of important work, with so much promise unfulfilled. The service began with Gabe’s recording of Salamone Rossi’s Hebrew setting of the Songs of Solomon, a gorgeous yet relatively unknown Venetian masterpiece \(\cdots\)

637. wayne#

  • alaska airlines 1184

  • expedia itinerary: 72599861510824

  • ticket number: 0278008590834

  • seattle (gate n4) - orange county (terminal b, gate 12)

  • fri, july 7

  • 6:40pm - 9:28pm (2h 48m duration)

  • tsapre TT129232B (dec 04, 2025)

07/06/2023#

638. pretty-please#

from tabulate import tabulate

# Define the table data as a list of lists
data = [
    ["studentuniverse", "", "mozart", "shaKespeare", "2023-06-15"],
    ["alaskan", "340733890", "mozart", "shaKespeare", "2023-02-05"],
    ["frontier", "90102556651", "mozart", "shaKespeare", "2023-07-06"]
]

# Define the table headers
headers = ["account", "number", "username", "password", "date"]

# Generate the table
table = tabulate(data, headers, tablefmt="simple")

# Print the table
print(table)
account          number       username    password     date
---------------  -----------  ----------  -----------  ----------
studentuniverse               mozart      shaKespeare  2023-06-15
alaskan          340733890    mozart      shaKespeare  2023-02-05
frontier         90102556651  mozart      shaKespeare  2023-07-06
data[0][4]
'2023-06-15'

639. seattle#

  • paramount

  • 911 pine st, seattle, wa 98101

  • confirmation 133704514

  • expedia itinerary 72600817540441

  • thu, jul 6 4pm - fri, jul 7 noon

640. homecoming#

  • mon, jul 10 at 7:56pm

  • orange county -> baltimore

  • trip confirmation HFLJJA

641. tas#

  • spring term 2024

    • xujun gu

    • ning meng

    • mgb

07/07/2023#

642. apple#

  • alex half-life

  • game for apple glasses

  • recommended by ángel

643. fawaz#

  • got qualified analyst to apply for job posting

  • schedule an interview as soon as application received

  • double-check by 07/12/2023 to make sure we’re not lagging behind

644. seattle#

  • i5 & Boren, Convention center & sparghetti

  • 8th & Pine, Paramount Hotel

  • 8th & Blanchard, Amazon

  • 1st/2nd & Blanchard (ie 115 Blanchard), Have a heart

  • Boca

    • Provoleta

    • Empanadas x2

    • Ceviche

  • Bodhizafa

    • Westcoast IPA

    • Georgetown Brewing

645. sound#

  • Two Brazilians

  • Look Italian

  • And sound too

  • But it’s Portuguese

646. thc#

  • Day after 400mg THC

  • Should never exceed 20mg

  • Perhaps always limit to 10mg

  • Would give you 10 days

  • And that is reasonable

\( Y = \beta_0 + \beta_1 X_1 + \cdots + \beta_N X_N + \epsilon_{i \sim N(\mu, \sigma)} \)

\(Y, X\) - sensory
\(\beta_i\) - processor
\(\epsilon_i\) - muscle/bone/agency
\(i\) - sampling, power, big-legs, limits

Auto-encoder

  • Predictions

  • Beta-coefficients \(b[k,1]\)

  • Variance-covariance \(V[k+df, b]\)

  • Regression (phenotype, dynamic), Survey (genotype, fixed)

647. catalog#

  • Python script to compile all ergot papers and create links with title & author

  • Later create page with hyperlinks for all the SRTR code-snippets

  • There’ll be a legacy repo but use that to launch a repo with built-in:

    • version control

    • collaboration

    • github co-pilot
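
a rough python sketch of the first bullet above; the folder name and the "author - title" filename convention are assumptions, and a fuller version would read pdf metadata instead:

```python
from pathlib import Path

# assumed layout: ergot_papers/<author> - <title>.pdf  (hypothetical naming convention)
papers_dir = Path("ergot_papers")

lines = []
for pdf in sorted(papers_dir.glob("*.pdf")):
    author, _, title = pdf.stem.partition(" - ")
    lines.append(f"- [{title or pdf.stem}]({pdf.as_posix()}) by {author}")

# one markdown bullet with a hyperlink per paper
Path("catalog.md").write_text("\n".join(lines))
```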

648. philosophy#

and education. with regard to regression analysis:

  1. Why are most didactic experiences dedicated to interpreting \(\beta_i\)?

  2. Should there be a threshold of \(\epsilon_i\) below which there’s no point interpreting \(\beta_i\)?

  3. Virtually all clinical trials are based on \(\beta_i\).

    • The very selective sampling makes \(\epsilon_i\) and prediction of \(\hat Y\) quite reasonable

    • But the real-world application will suck: \(cf\) 95% efficacy of COVID mRNA vaccines

    • Let \(\epsilon_i \propto \frac {1}{1 - \beta} \)

649. poland#

  • stefan (Uber driver)

  • $100 for a night in 5-star hotel

  • A dollar goes a long way

  • So be sure to go on vacation

  • was imprisoned and expelled from his home-country in the 70s

  • present regime has compensated him for this transgression

  • meanwhile he has established a life for his family in the us

650. seatac#

  • train to downtown is $5

  • uber is $75

  • seatac is halfway between seattle & tacoma, near renton

  • if in no mood of spending, use the light rail

651. wayne#

  • most g5’s ever seen in one place

  • also a pretty large commercial airport

  • santa ana, orange county, ca

07/08/2023#

652. five-o’clock#

  • woken me up

  • call to prayer at 4:20

  • it’s a song

653. tameth#

\(Y_i = \hat Y_i + \epsilon_i\)

07/10/2023#

654. verbs#

  • tameth \(1 - \frac{SSR}{SST} \sim 1 - \frac{var(\epsilon_i)}{var(Y_i)}\)

  • whineth \(\frac{SSR}{SST} \sim \frac{var(\epsilon_i)}{var(Y_i)}\)

  • hideth \(\hat Y_i \sim Y_i - \epsilon_i\)
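
a small numpy sketch of the three verbs, computing tameth as \(1 - \frac{var(\epsilon_i)}{var(Y_i)}\) on simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = 2 + 1.5 * x + rng.normal(size=500)

b1, b0 = np.polyfit(x, y, 1)          # least-squares fit of y on x
y_hat = b0 + b1 * x                   # hideth: the part of Y the model accounts for
resid = y - y_hat                     # the leftover epsilon_i

tameth = 1 - resid.var() / y.var()    # share of variance tamed (~ R^2)
whineth = resid.var() / y.var()       # share still unexplained
print(round(tameth, 2), round(whineth, 2))
```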

655. amy#

  • she’s a text away from fawaz!

  • didn’t even so much as say bye

  • the deterioration is attributable to mgb

656. hangar24#

  • it’s 1 mile from john wayne airport

  • has over 15 local craft beers on tap

  • embedded within a corporate area

  • ambience, comfort, menu items, etc

  • my rating is 4.5/5; not a wide range of clientele

657. soca#

  • a sort of paradise: the weather, beaches, lifestyle

  • how may one live in this environment and achieve flow?

  • that is a question to be gently approached in the coming years

07/11/2023#

658. b’more#

  • ewns, home is best!

659. auteurs#

  • chaplin

  • woody

  • perry

actors, screenwriters, and directors that demanded and attained control when they became producers

660. contest#

while The Economist was first published in September 1843 to take part in a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress, the more enduring contest is really between beauty, which needs no explanation, and truth, which has believers and doubters.

aesthetics is the subject focusing on beauty, and ethics is the one about truth. it should be noted that while beauty is self-evident and compelling, truth and ethics start off with declarations about their self-evident essence, which are outrightly rejected by many.

to speak more specifically, ethics and truth represent conscious or unconscious wishes by the more caring kinds of human beings, who wish to protect vulnerable lambs from wolves. aesthetics and beauty by contrast simply represent effects that inspire awe, wonder, and life.

so why are these two at war? can’t honest neighbors be friends? unfortunately, truth and beauty are mutually exclusive. a beautiful face might have launched a thousand war ships, but in general beauty doesn’t inspire war. it is arguments and disagreements about the truth that account for virtually all wars and conflicts in history.

beauty is never contested. even the trojan war wasn’t a contest about beauty; it was a contest for an acclaimed beauty. and often beauty is portrayed as lacking in agency, vulnerable, in need of rescue. and here is precisely where truth and war creep in.

for who is the truest and worthiest knight in shining armour? beauty always goes with the winner of the duel, battle, or war. dead shepherd, now i see thy saw of might: carefree, mocking, unconcerned, scornful, coercive, violent; so wisdom wisheth us; she is a woman, and ever loveth only a warrior.

661. understood#

why don’t i have the urge to correct a wrong impression one may have of me?
more generally, why are folks so afraid of being misunderstood? is there really anything at stake or is it just ego?

662. wilde#

Beauty is a form of Genius–is higher, indeed, than Genius, as it needs no explanation. It is one of the great facts of the world, like sunlight, or springtime, or the reflection in the dark waters of that silver shell we call the moon. It cannot be questioned

663. actions#

a soldier or warrior is never afraid of being misunderstood!

664. verbs#

  • tameth to do acclaimed stuff and take the spoils [50]

  • whineth to be misunderstood (please hear me out!)

  • hideth to lie in the shadows (thereby avoiding evil)[37]

07/12/2023#

665. coping#

  • illusions. the ideas that in spite of, and perhaps because of, their inaccuracy enable healthy coping with the human condition

  • delusions. ideas that interfere with our ability to relate with fellow man and to function socially

  • artists. creators of the most compelling illusions that have sustained motivation and the will-to-life in the delicate creature, man

  • scientists. that subgroup of mankind that didn’t get the memo on the absolute necessity of illusions for the general and specific wellness of man

  • philosophers. virtually every man and woman; more’s the pity that most haven’t bothered to study their forebears!

666. acting#

isn’t it upsetting when your hero, who always seemed to be in control of things, turns out to be merely an outstanding actor? beware of meeting your hero in person: it comes with a huge risk and might precipitate cognitive dissonance at best and an existential crisis at worst.

for some their hero is the scientific enterprise. and so they dedicate a lifetime as participants, gladiators if you will, in the arena. it is here that they become acquainted with their hero and, alas, the veil of illusion is lifted.

art should celebrate life in the midst of mystery, without any inkling of despair or the illusions sought by science.

667. jhu#

  • president ron daniels

  • his vision for jhu by 2020

Johns Hopkins is a resilient, mission-driven place that refuses to wait for the ideal conditions, but instead seizes upon all opportunity for brave innovation. This is a core feature of our institutional DNA.

668. catchphrase#

  • my cup runneth over

  • ekikompe kyange kiyiika (luganda: my cup overflows)

  • la mia tazza straripa (italian: my cup overflows)

669. orangecounty#

  • the clear quran

  • a thematic translation

  • dr. mustafa khattab

My dear brother Abi,

I wish you the best in this life and the hereafter

Fawaz

Please Note: The Quran, sometimes written as “Koran,” is a sacred book of Muslims, who believe that God has revealed His word and message to humanity throughout the ages. For example, we believe that Moses was given the Torah, David was given the Psalms, and Jesus was given the Gospel. In the same way, we believe that Muhammad was given the Quran. Muslims therefore treat the holy books of other faiths with great care and respect. For instance, we do not read these books in the bathroom or place them on the floor. It is our request that you treat this book with the same respect and handle it as you would your own scripture. If you do not want this copy, please do not throw it away. You may give it to any Muslim you know or your nearest Islamic center. You may also call us at 1-888-273-2754, and we will arrange to pick it up.

670. enfrançais#

Yes, there is a link between the French words “voir,” “savoir,” and “pouvoir.” These three words are part of a group of verbs known as the “irregular -oir verbs.” While they are not conjugated in the same way as regular -ir verbs, they do share some similarities.

The common link between these three verbs is their infinitive endings: -oir. This ending is characteristic of a group of verbs in French, and it often indicates the verb’s meaning or function.

Here’s a breakdown of each verb and its meaning:

  1. “Voir” - This verb means “to see.” It is used to express the act of perceiving something visually. For example:

    • Je vois un oiseau. (I see a bird.)

    • Est-ce que tu vois la mer? (Do you see the sea?)

  2. “Savoir” - This verb means “to know.” It is used to express knowledge or awareness of information or facts. For example:

    • Je sais parler français. (I know how to speak French.)

    • Savez-vous où se trouve la bibliothèque? (Do you know where the library is?)

  3. “Pouvoir” - This verb means “to be able to” or “can.” It is used to express capability or possibility. For example:

    • Je peux t’aider. (I can help you.)

    • Peux-tu venir demain soir? (Can you come tomorrow evening?)

While these verbs share the -oir ending and are irregular in their conjugation, they do not have any inherent etymological or semantic connection beyond being part of the same verb group.

671. verbs#

  • pouvoir tameth to do

  • savoir whineth to be

  • voir hideth to no

The phrase commonly used to refer to “see no evil, hear no evil, speak no evil” is “les trois singes de la sagesse” in French. Literally translated, it means “the three monkeys of wisdom.” Each monkey represents a different principle: “ne rien voir de mal” (see no evil), “ne rien entendre de mal” (hear no evil), and “ne rien dire de mal” (speak no evil).

Certainly! The variant “do no evil” can be translated as “ne rien faire de mal” in French. It complements the traditional phrase “see no evil, hear no evil, speak no evil” with the additional principle of not doing any evil actions.

672. belgians#

  • orval 6.2% 3:45PM - 5:20PM

  • westmalle dubbel 7.0% 5:20PM - 8:37PM

  • duvel 8.5% 8:37PM - 11:04PM

  • westmalle tripel 9.5% 12:24AM - 02:01AM

  • st. bernardus abt 12 10.0% 04:18AM - 06:02AM

673. clever#

  • nutrition

  • environment

  • hygiene

There is a midway condition which a man of such a destiny will not be able to recall without emotion: it is characterized by a pale, subtle happiness of light and sunshine, a feeling of bird-like freedom, bird-like altitude, bird-like exuberance, and a third thing in which curiosity is united with a tender contempt. A free-spirit, this cool expression, does one good in every condition, it’s almost warming. One lives no longer in the fetters of love and hatred, without yes, without no, near or far as one wishes, preferably slipping away, evading, fluttering off, gone again, again flying aloft; one is spoilt, as everyone is who has at some time seen a tremendous number of things beneath him — and one becomes the opposite of those who concern themselves with things that have nothing to do with them. Indeed, the free-spirit henceforth has only to do with things — and how many things! — with which he is no longer concerned...

from January 14, 2020 at 3:15AM + 8hrs

Is there any culture in human history that treats each cohort of newborns, over a lifetime, as sui generis? Of course that would be impractical; however, at a bare minimum a great culture should, every year from cradle to grave, do the following:

  1. present a hierarchy of challenges for each year

  2. train the cohort so that they might develop skills to match the challenge

  3. run a weekly peer-driven feedback system to align challenge-levels with skill-levels for all in the cohort

07/13/2023#

674. today#

  • simeon

  • homerun

  • journaling

675. friends#

  • develop a system of staying in touch with the wider network of friends

  • with our aging parents there is just a lot going on for everyone

  • at times just the humble reassurance about diagnosis, prognosis, and expectations is helpful

  • some unsolicited advice about avoiding hospitalization at all costs will be offered

  • but just simply play role of care coordinator, a neglected role across the globe

676. three-peat#

 soul, t 
     \
      mind, x --> verb, n --> change, o  
     /
     body, s

677. cluster#

definitive guide to accessing the cluster

07/14/2023#

678. learning#

                                          3. parametric
                                                       \ 
                       —> 0. humans  —> 1. twoway       4. model —> 5. data -> 6. likelihood -> 7. inference
                                                        / 
                                                        2. nonparametric

      




                                          3e.ἄσκησις,μ
                                                       \ 
                —> 0g.κοσμογονία,γ  —> 1f.ἡἔρις,κ       4d.∫δυσφορία.dt,ψ —> 5c.φάρμακον,δ -> 6b.ομορφιά,β -> 7a.τάξη,α 
                                                        / 
                                                        2e.πρᾶξις,σ    
                                                        



                                                        
                                    3. supervised
                                                  \ 
 —> 0. machines  —> 1. networks: adversarial       4. pca —> 5. autoencorder -> 6. deep -> 7. adaptive
                                                  / 
                                                  2. unsupervised                                                                                                 

679. publishing#

  1. markup your document and save it as .txt, .md, or .do

<<dd_version: 2>> 

'''
<<dd_do>>
webuse lifeexp, clear 
encode country, gen(Country)
twoway scatter lexp Country, xscale(off)
graph export lexp_bycountry.png, replace 
qui sum lexp 
<</dd_do>>
'''

<<dd_graph>>

Here's a summary of the data: The total number of countries is
<<dd_display: r(N)>>, mean life expectancy is <<dd_display: r(mean)>>, and the
standard deviation is  <<dd_display: r(sd)>>
  2. type the following command into the stata command window:

dyndoc publish.do, saving(publish.html)
  3. inspect your output in the stata results window

. dyndoc publish.do, saving(publish.html)
. webuse lifeexp, clear 
(Life expectancy, 1998)

. encode country, gen(Country)

. twoway scatter lexp Country, xscale(off)

. graph export lexp_bycountry.png, replace 
file /Users/d/Desktop/lexp_bycountry.png saved as PNG format

. qui sum lexp 

. 

successfully converted "/Users/d/Desktop/publish.html"

. 
  4. now inspect what is output to the working folder

ls -l
. ls -l

total 456
-rw-r--r-		1	d	staff	59500	Jul	14	03:37	Graph.svg
-rw-r--r-		1	d	staff	59500	Jul	14	03:38	Graph1.svg
-rw-r--r-		1	d	staff	96154	Jul	14	03:38	lexp_bycountry.png
-rw-r--r-	@	1	d	staff	401	Jul	14	03:37	publish.do
-rw-r--r-	@	1	d	staff	461	Jul	14	03:38	publish.html
-rw------		1	d	staff	402	Jul	14	03:37	~publish.do.stswp

. 
  5. what is publishing?

    • handing in homework as .pdf or .html which unlike a .log file can have text and output intermingled and with code and output as options

    • sharing in-house exploratory data analyses with collaborators

    • conference abstract

    • original research submitted to a peer-review journal

    • brochures by a pharmaceutical company describing results from a clinical trial of their latest drug

    • etc.

Hide code cell source
import networkx as nx
import matplotlib.pyplot as plt

G = nx.DiGraph()
G.add_node("user", pos=(550, 500))
G.add_node("system", pos=(-550, 500))
G.add_node("program", pos=(-2000, 960))
G.add_node("syntax", pos=(2000, 950))
G.add_node("ado", pos=(-3000, 550))
G.add_node("do", pos=(3000, 550))
G.add_node("command", pos=(-1900, 150))
G.add_node("queue", pos=(1900, 150))
G.add_node("results", pos=(0, 0))
G.add_node("dta", pos=(0, -475))
G.add_node("log", pos=(-1900, -475))
G.add_node("excel", pos=(-4000, -475))
G.add_node("word", pos=(1900, -475))
G.add_node("html", pos=(4000, -475))
G.add_node("publish", pos=(0, -950))

G.add_edges_from([("program", "ado"), ("syntax", "do")])
G.add_edges_from([("ado", "command"), ("do", "queue")])
G.add_edges_from([("command", "results"), ("queue", "results"), ("results", "excel"), ("results", "word"), ("results", "html")])
G.add_edges_from([("results", "dta"), ("results", "log")])
G.add_edges_from([("dta", "publish"), ("log", "publish"), ("excel", "publish"), ("word", "publish"), ("html", "publish")])

pos = nx.get_node_attributes(G, 'pos')
labels = {"user": "user",
          "system": "system",
          "program": "program",
          "syntax": "syntax",
          "ado": "ado",
          "do": "do",
          "command": "command",
          "queue": "queue",
          "results": "results",
          "dta": ".dta",
          "log": ".log",
          "excel": ".xlsx",
          "word": ".docx",
          "html": ".html",
          "publish": "publish"}

plt.figure(figsize=(8, 8))
nx.draw(G, pos, with_labels=False, node_size=3000, node_color="lightblue", linewidths=3)
nx.draw_networkx_labels(G, pos, labels, font_size=10, font_weight='bold')
plt.xlim(-5000, 5000)
plt.ylim(-1000, 1000)
plt.axis("off")
plt.show()
  6. do the pat hack

cp Graph.svg _build/html/Graph1.svg
cp publish.html _build/html/publish.html
  7. voila!

680. time#

  1. active

  2. magic

  3. passive

Let’s explore the contrast between active and passive processes, which occur over time, and miracles, which are not bound by cause-effect relationships or the dimension of time.

680.1 active#

Active processes involve a series of intentional and directed actions that take place over a period of time. They are characterized by cause-and-effect relationships and a sense of agency or deliberate effort. Here are some key attributes of active processes:

  1. Intentionality: Active processes are initiated and driven by conscious decisions or actions.

  2. Cause and Effect: They follow a logical sequence of steps, where each action has a direct impact on the subsequent outcome.

  3. Time-bound: Active processes unfold over a specific duration, with a beginning, middle, and end.

  4. Human Agency: They involve human intervention, decision-making, and effort to achieve a desired outcome.

  5. Measurability: Active processes can often be measured or assessed based on predefined metrics or milestones.

Examples of active processes include project management, problem-solving methodologies, skill development through practice and training, or any task that requires deliberate action and progresses over time. [18]

680.2 passive#

Passive processes, also known as natural or inherent processes, occur over time without conscious or deliberate human intervention. They are driven by natural laws, environmental factors, or inherent properties of a system. Here are some key attributes of passive processes:

  1. Natural Laws: Passive processes operate according to predetermined rules or laws of nature.

  2. Cause and Effect (Indirect): While passive processes still exhibit cause-and-effect relationships, the cause is often indirect and less influenced by human agency.

  3. Time-bound: Passive processes also unfold over time, but their progression may be slower or less perceivable.

  4. Environmental Factors: They are influenced by external conditions, such as climate, geology, or biological systems.

  5. Self-Organization: Passive processes often involve self-regulating or self-organizing systems that maintain stability or equilibrium.

Examples of passive processes include erosion, weathering, ecological succession, natural healing or recovery processes, or the growth of plants in response to environmental factors. While these processes can be influenced by external factors, they primarily occur without conscious human intervention.

680.3 miracle#

Miracles, as mentioned earlier, are extraordinary events or phenomena that defy the laws of nature and cause-effect relationships. Here are the contrasting aspects between active/passive processes and miracles:

  1. Cause and Effect: Active and passive processes are governed by cause and effect, where specific actions or natural laws drive the progression of events over time. In contrast, miracles are seen as exceptions to cause-effect relationships, often defying rational explanation.

  2. Time Dimension: Active and passive processes occur over time, with a sense of progression, while miracles are not bound by the dimension of time. Miracles are considered spontaneous, exceptional events that transcend the normal flow of cause-effect relationships.

  3. Human Agency: Active processes involve deliberate human intervention and effort, whereas passive processes occur naturally without direct human influence. In contrast, miracles are often attributed to supernatural or divine forces beyond human control.

  4. Predictability: Active and passive processes are generally predictable to some extent, with expected outcomes based on known factors or laws. Miracles, however, are viewed as unpredictable and extraordinary, defying expectations and logical explanations.

In summary, active and passive processes involve cause-effect relationships and occur over time, with active processes driven by deliberate human action and passive processes occurring naturally. Miracles, on the other hand, transcend cause-effect relationships, defy rational explanation, and are not bound by the dimension of time.

681. joshua#

  • new resident portal as of 07/01

  • password: shaKespeare

  • but use app

682. yay!#

683. chordstack#

  • Professor Vaughn Brathwaite - The Gospel University

  • His Piano system will take you to another level

  • Berklee College of Music

  • I’ve signed up & should remain committed for 10y

684. gratitude 🙏#

  • andrew

  • vincent

  • fawaz

07/15/2023#

685. flow#

i guess i’ve been in flow this past week, since returning from the west coast

686. newlove#

  • victoria monét is a dionysian girl par excellence

  • her art form is beyond good & evil: even beyoncé can’t say that

  • but it’s really her reverence for tradition that wins me

  • and yet it’s not mere ancestral worship: she draws blood at other times

  • smoke is probably the best proof of this assertion

687. r&b#

  • her - gotta move on

  • sir - nothing even matters

689. tonibraxton#

sui generis

690. ye#

father stretch my hands, pt. 1

691. stata#

  • grading heuristic

  • c - visualize observed data (enhance basic stuff from epi)

  • b - compare observed to expected (r-class: o/e test-statistics, e(N)/r(N))

  • a - decision about quality or meaning of data (p-values, credibility intervals)

  • timeline

    1. oneway

    2. twoway

    3. multivariable

    4. hierarchical

    5. r, python, & ai

692. tutor#

Hey Abimereki, it's Vaughn J Brathwaite from Gospel University. Our ZOOM call is on Jul 15 @ 04:00 PM . Here's the ZOOM link: https://us06web.zoom.us/j/88924354455?pwd=VDhvd21LTDlwTUF2WHB4NGpUVGg2dz09 . Also, Vaughn wants you to watch this video before our Zoom call - join.gospelu.com/pre-call-hw
 Sent by OnceHub.

693. veranolibro#

  • start experimenting

  • for the 340.600

  • have #691 above in mind

694. annuŋŋamya#

  • continue building variations on the original theme

  • recruit the chord-stack system

  • and lets see how that works

695. organ#

  • gospel university

  • online gospel music academy

  • vaughn (pronounced von) brathwaite

  • berklee college of music

  • chord stack system

Your upcoming appointment is starting in 1 hour

Dear Abimereki,
Your appointment is starting in approximately 1 hour. Please see below for more information.

Booking details
Subject
Chord Stack System - Consultation
Booking page
Vaughn Brathwaite
Time
Sat, Jul 15, 2023, 04:00 PM - 05:00 PM
United States; Eastern time (GMT-4:00) [DST] 
Cancel/Reschedule
Additional guests
Conferencing information
When it's time, join the meeting from PC, Mac, Linux, iOS or Android: 
https://us06web.zoom.us/j/88924354455?pwd=VDhvd21LTDlwTUF2WHB4NGpUVGg2dz09
Passcode: piano

Access Information:
Where: Zoom
Meeting ID: 88924354455
Meeting passcode: piano
Booking ID
BKNG-AFCY906BGZHP
  • consultation session

  • united credit (855) 503 1800 of irvine, california

  • please upload photo id (did passport)

  • lending point agreement signed

Hi Abimereki,
Please sign this Non-Disclosure agreement (which confirms that you will NOT share any information from the GMC program with the public) to continue your enrollment process for the Gospel Musician Coaching program.
After this document has been completed, you will receive instructions on how to setup and access ALL the materials within the Gospel Musician Coaching program.
Congratulations on joining the program!
Cheers,
Customer Success Team - Gospel University
Hi Abimereki,
I'm Lorenzo and I'll be your piano coach and mentor over the next 6 months!
I'm excited to meet you and get started. 
Here is the link where you can book your onboarding call with me - CLICK HERE
On our first meeting we'll go through your playing video that you submitted and I'll give you some feedback and guidance on where to go from there.
Looking forward to meeting you!
- Lorenzo
The Gospel University, 1545 VFW PKWY, West Roxbury, MA 02132, United States, 8888743930


Hey Abimereki,
 
Welcome to Gospel Musician Coaching!
 
I’m looking forward to our journey together.
 
Follow these IMPORTANT Steps:

1. Find your Login Link Instructions (this was sent to you in a separate email), then....
 
2. Login here - https://academy.thegospeluniversity.com/users/sign_in
 
 
3. Complete your ONBOARDING process here - CLICK HERE
 
It'll walk you through the entire Gospel University learning platform and show you where everything is.
 
 
4. Make sure you introduce yourself in the GMC Student Community - CLICK HERE


OTHER IMPORTANT STUFF:
 
REMEMBER! Do NOT share any links or any information provided within the GMC program with anyone OUTSIDE the program. Doing so VOIDS your signed agreement with us.
 
Welcome again, and if you have any questions feel free to email us at: support@thegospeluniversity.com
 
Welcome to the GMC family.
- Vaughn
	
Your upcoming appointment is starting in 10 minutes

Dear Abimereki,
Your appointment is starting in approximately 10 minutes. Please see below for more information.

Booking details
Subject
Chord Stack System - Consultation
Booking page
Vaughn Brathwaite
Time
Sat, Jul 15, 2023, 02:30 PM - 03:30 PM
United States; Eastern time (GMT-4:00) [DST] 
Cancel/Reschedule
Additional guests
Conferencing information
When it's time, join the meeting from PC, Mac, Linux, iOS or Android: 
https://us06web.zoom.us/j/88924354455?pwd=VDhvd21LTDlwTUF2WHB4NGpUVGg2dz09
Passcode: piano

Access Information:
Where: Zoom
Meeting ID: 88924354455
Meeting passcode: piano
Booking ID
BKNG-AFCY906BGZHP

696. stata#

  • i unconsciously separated the conceptual (sessions 1-5) from the technical (hub notes 1-27)

  • but lets formalize this and capitalize on it to advance our cause even more efficiently

  • should help develop the grading rubric so that weights are meaningfully assigned

697. recap#

fingers = voices
so.. voicing
confused me for a min
of course music’s origins
are entirely vocal
a capella sistina
for worship & praise
well my voicing is
going to advance
from octave to more 
nuanced in left hand

698. zone#

been in it lately!

699. migration#

~/dropbox/1f.ἡἔρις,κ/1.ontology
  • ai, python, r, and stata workflows

  • careful deployment of naming conventions

  • jupyter-book folders will be seasons

  • github-repo clones will be books

  • lets compile the language variants

  • will be able to have offline large datasets

  • in the vicinity requiring only a ../dataset.csv

  • while importing them via a jupyter script (see the sketch below)
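
a minimal sketch of that last bullet, assuming a hypothetical ../dataset.csv one level above the notebook (the filename is illustrative, not one of the actual datasets):

import pandas as pd

# the large dataset stays outside the repo; the notebook only references a relative path
df = pd.read_csv("../dataset.csv")
print(df.shape)  # quick sanity check before any analysis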

700. chemistry#

  • musical prodigies are immersed in chemistry

  • perhaps unique and innate brain chemistry

  • but often also of the self-medicated variety

  • these folks can’t explain their process

  • and often their flickering lights

  • in the windy & tempestuous life we live

  • are blown out while the candle is still tall

  • enter vaughn brathwaite with his

  • chord stack system of the gospel university

701. ancient&modern#

the choir of belfast cathedral

1 
There’s a wideness in God’s mercy,
like the wideness of the sea.
There’s a kindness in God’s justice,
which is more than liberty.

2 
There is welcome for the sinner,
and more graces for the good.
There is mercy with the Savior,
there is healing in his blood.

3 
But we make God’s love too narrow
by false limits of our own,
and we magnify its strictness
with a zeal God will not own.

4 
For the love of God is broader
than the measures of the mind,
and the heart of the Eternal
is most wonderfully kind.

5 
If our love were but more simple,
we should rest upon God’s word,
and our lives would be illumined
by the presence of our Lord

702. conventions#

Yours truly: Produce code that generates a table with 6 columns (Language, Season 1, Season 2, Season 3, Season 4, Book). Row 1 will have the following values for the respective cells: English, Spring, Summer, Fall, Winter, Book. Now, please, fill out additional rows with translations for the following languages: Spanish, French, German, Italian, portuguese, swedish, Norwegian, icelandish, danish, dutch, latin, Russian, Chinese, Korean, Japanese, Vietnamese, arabic, Hindi, Gujarati, Zulu, Yoruba, Luganda, Swahili, etc. please use only the common roman alphabet or variants that are easy to use on the common keyboard

ChatGPT: Certainly! Below is the code that generates the table with 6 columns (Language, Season 1, Season 2, Season 3, Season 4, Book) and fills out additional rows with translations for the specified languages:

Hide code cell source
import pandas as pd

data = [
    ["English", "Spring", "Summer", "Fall", "Winter", "Book"],
    ["Spanish", "Primavera", "Verano", "Otoño", "Invierno", "Libro"],
    ["French", "Printemps", "Été", "Automne", "Hiver", "Livre"],
    ["German", "Frühling", "Sommer", "Herbst", "Winter", "Buch"],
    ["Italian", "Primavera", "Estate", "Autunno", "Inverno", "Libro"],
    ["Portuguese", "Primavera", "Verão", "Outono", "Inverno", "Livro"],
    ["Swedish", "Vår", "Sommar", "Höst", "Vinter", "Bok"],
    ["Norwegian", "Vår", "Sommer", "Høst", "Vinter", "Bok"],
    ["Icelandish", "Vor", "Sumar", "Haust", "Vetur", "Bók"],
    ["Danish", "Forår", "Sommer", "Efterår", "Vinter", "Bog"],
    ["Dutch", "Lente", "Zomer", "Herfst", "Winter", "Boek"],
    ["Latin", "Ver", "Aestas", "Autumnus", "Hiems", "Liber"],
    ["Russian", "Весна", "Лето", "Осень", "Зима", "Книга"],
    ["Chinese", "春季", "夏季", "秋季", "冬季", "书"],
    ["Korean", "봄", "여름", "가을", "겨울", "책"],
    ["Japanese", "春", "夏", "秋", "冬", "本"],
    ["Vietnamese", "Mùa Xuân", "Mùa Hè", "Mùa Thu", "Mùa Đông", "Sách"],
    ["Arabic", "ربيع", "صيف", "خريف", "شتاء", "كتاب"],
    ["Hindi", "वसंत", "ग्रीष्म", "शरद्", "शीत", "पुस्तक"],
    ["Gujarati", "વસંત", "ગ્રીષ્મ", "પતઝડ", "શિયાળ", "પુસ્તક"],
    ["Zulu", "Lentha", "Inhlaba", "Inyanga", "Injini", "Incwadi"],
    ["Yoruba", "Ọsan", "Ètì", "Ọla", "Harmattan", "Ìwe"],
    ["Luganda", "Omukaka", "Kyampa", "Kwakya", "Ssekidde", "Empaako"],
    ["Swahili", "Masika", "Vuli", "Kipupwe", "Kiangazi", "Kitabu"],
    # Add more languages and translations here
]

df = pd.DataFrame(data, columns=["Language", "Season 1", "Season 2", "Season 3", "Season 4", "Book"])
print(df)
      Language   Season 1 Season 2  Season 3   Season 4     Book
0      English     Spring   Summer      Fall     Winter     Book
1      Spanish  Primavera   Verano     Otoño   Invierno    Libro
2       French  Printemps      Été   Automne      Hiver    Livre
3       German   Frühling   Sommer    Herbst     Winter     Buch
4      Italian  Primavera   Estate   Autunno    Inverno    Libro
5   Portuguese  Primavera    Verão    Outono    Inverno    Livro
6      Swedish        Vår   Sommar      Höst     Vinter      Bok
7    Norwegian        Vår   Sommer      Høst     Vinter      Bok
8   Icelandish        Vor    Sumar     Haust      Vetur      Bók
9       Danish      Forår   Sommer   Efterår     Vinter      Bog
10       Dutch      Lente    Zomer    Herfst     Winter     Boek
11       Latin        Ver   Aestas  Autumnus      Hiems    Liber
12     Russian      Весна     Лето     Осень       Зима    Книга
13     Chinese         春季       夏季        秋季         冬季        书
14      Korean          봄       여름        가을         겨울        책
15    Japanese          春        夏         秋          冬        本
16  Vietnamese   Mùa Xuân   Mùa Hè   Mùa Thu   Mùa Đông     Sách
17      Arabic       ربيع      صيف      خريف       شتاء     كتاب
18       Hindi       वसंत  ग्रीष्म      शरद्        शीत   पुस्तक
19    Gujarati       વસંત  ગ્રીષ્મ      પતઝડ      શિયાળ   પુસ્તક
20        Zulu     Lentha  Inhlaba   Inyanga     Injini  Incwadi
21      Yoruba       Ọsan      Ètì       Ọla  Harmattan      Ìwe
22     Luganda    Omukaka   Kyampa    Kwakya   Ssekidde  Empaako
23     Swahili     Masika     Vuli   Kipupwe   Kiangazi   Kitabu

The provided code will create a DataFrame with the translations for the specified languages. You can extend the list of languages and their translations as needed by following the same pattern used in the code.

Hide code cell source
import pandas as pd

data = [
    ["English", "Open", "Science"],
    ["Spanish", "Abierto", "Ciencia"],
    ["French", "Ouvert", "Science"],
    ["German", "Offen", "Wissenschaft"],
    ["Italian", "Aperto", "Scienza"],
    ["Portuguese", "Aberto", "Ciência"],
    ["Swedish", "Öppen", "Vetenskap"],
    ["Norwegian", "Åpen", "Vitenskap"],
    ["Icelandish", "Opin", "Vísindi"],
    ["Danish", "Åben", "Videnskab"],
    ["Dutch", "Open", "Wetenschap"],
    ["Latin", "Apertus", "Scientia"],
    ["Russian", "Открыто", "Наука"],
    ["Chinese", "开放", "科学"],
    ["Korean", "열린", "과학"],
    ["Japanese", "オープン", "科学"],
    ["Vietnamese", "Mở", "Khoa học"],
    ["Arabic", "مفتوح", "علم"],
    ["Hindi", "खुला", "विज्ञान"],
    ["Gujarati", "ખોલ્યું", "વિજ્ઞાન"],
    ["Zulu", "Okuvulwe", "Ulwazi"],
    ["Yoruba", "Ọya", "Ẹkọ"],
    ["Luganda", "Ebisolo", "Ebitonde"],
    ["Swahili", "Wazi", "Sayansi"],
    # Add more languages and translations here
]

df = pd.DataFrame(data, columns=["Language", "Open", "Science"])
print(df)
      Language      Open       Science
0      English      Open       Science
1      Spanish   Abierto       Ciencia
2       French    Ouvert       Science
3       German     Offen  Wissenschaft
4      Italian    Aperto       Scienza
5   Portuguese    Aberto       Ciência
6      Swedish     Öppen     Vetenskap
7    Norwegian      Åpen     Vitenskap
8   Icelandish      Opin       Vísindi
9       Danish      Åben     Videnskab
10       Dutch      Open    Wetenschap
11       Latin   Apertus      Scientia
12     Russian   Открыто         Наука
13     Chinese        开放            科学
14      Korean        열린            과학
15    Japanese      オープン            科学
16  Vietnamese        Mở      Khoa học
17      Arabic     مفتوح           علم
18       Hindi      खुला       विज्ञान
19    Gujarati   ખોલ્યું       વિજ્ઞાન
20        Zulu  Okuvulwe        Ulwazi
21      Yoruba       Ọya           Ẹkọ
22     Luganda   Ebisolo      Ebitonde
23     Swahili      Wazi       Sayansi

703. vaughn#

consultation: 37:54

  • voir

    • believes my listening, appreciation, and feel is advanced

  • savoir

    • however my thinking about chords, voicing of my left hand for organ need work

  • pouvoir

    • only then will my playing (pouvoir) match my feeling (voir) for music - it’s ’bout that bridge (savoir)

704. coach#

submit my first piano video to him (see 6XX above)

705. ai#

AI  summary

Abimereki and Vaughn had a conversation about Abimereki's passion for music and his struggle to play the way he wants to feel. They discussed Abimereki's background in music, his processing methods, and the challenge of analyzing and expressing emotions through music.

0:05
Abimereki and Vaughn discussed Abimereki's struggles with playing chords on the piano and his desire to improve his left hand voicings. Vaughn explained the limitations of online instruction and the need for a comprehensive and efficient approach to learning.

8:57
Vaughn and Abimereki discussed Abimereki's passion for music and his desire to become a church musician, specifically playing the organ. Vaughn offered to help Abimereki achieve his goal by providing guidance and connections within the church community.

16:24
Vaughn discussed his plan to help Abimereki improve his musical skills by restructuring his thinking, filling in information gaps, and teaching him how to think creatively and apply new concepts. They also discussed the coaching process, where Abimereki would submit videos of his progress for evaluation and improvement.

25:14
Vaughn explained the benefits of group master classes and how they have helped students improve their skills in a shorter time period. Abimereki expressed interest in joining and discussed the option of financing the tuition fees.

07/16/2023#

706. openai#

  • dall·e USD 15

  • gpt4 USD 20

  • new subscriptions

  • squeeze out value!

707. hypothesis#

Hide code cell source
import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Reducing the wavelength by a quarter
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the sinusoid function with the middle yellow line as its axis
sinusoid = flow_channel + np.sin(skill_levels + phase) * amplitude

# Define the anxiety and boredom areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
boredom_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and boredom areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Boredom', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='#FFCC00', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the sinusoid function
plt.plot(skill_levels, sinusoid, color='purple', linestyle='-')

# Add arrowhead to the sinusoid line (flipped direction)
plt.arrow(skill_levels[-2], sinusoid[-2], skill_levels[-1] - skill_levels[-2], sinusoid[-1] - sinusoid[-2],
          color='purple', length_includes_head=True, head_width=-0.15, head_length=-0.3)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='#FFCC00', linestyle='-')

# Set plot labels and title
plt.xlabel('Credo')
plt.ylabel('Setting', rotation='horizontal', ha='right')  # Rotate the label horizontally

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'art', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'engineering', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='#FFCC00', edgecolor='#FFCC00', boxstyle='round'))
plt.text(6.2, 2.1, 'science', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()
  1. engineering

    • emergent flow channel

    • represents continuous never ending struggle

    • of use to artist and scientist alike

  2. art

    • ones credo is mostly insufficient [36]

    • but things magically sort themselves out

    • call it deus ex machina if you will!

  3. science

    • randomized trials create very narrow, unreal settings

    • so they are never generalizable

    • comforting to reaffirm ones credo

07/17/2023#

708. journaling#

  • uncovered how i’d reached my recent insights much, much earlier

  • so what then happened to them? why didn’t they leave a mark?

  • channeling sir falstaff: detraction will not suffer it!

709. conventions#

Hide code cell source
import pandas as pd

data = {
    'language': ['English', 'Spanish', 'French', 'German', 'Italian', 'Portuguese', 'Swedish', 'Norwegian', 
                 'Icelandish', 'Danish', 'Dutch', 'Latin', 'Russian', 'Chinese', 'Korean', 'Japanese', 
                 'Vietnamese', 'Arabic', 'Hindi', 'Gujarati', 'Zulu', 'Yoruba', 'Luganda', 'Swahili'],
    'note': ['Note', 'Nota', 'Note', 'Notiz', 'Nota', 'Nota', 'Anteckning', 'Notat', 
             'Minnispunktur', 'Notat', 'Notitie', 'Nota', 'Заметка', '笔记', '노트', 'ノート', 
             'Ghi chú', 'مذكرة', 'नोट', 'નોટ', 'Inothi', 'Akílẹ̀', 'Yiga', 'Kumbuka'],
    'pad': ['Pad', 'Bloc de notas', 'Bloc-notes', 'Notizblock', 'Blocco di note', 'Bloco de notas', 'Anteckningsblock', 'Notatblokk', 
            'Minnisbók', 'Notesbog', 'Notitieblok', 'Tabula', 'Блокнот', '记事本', '노트패드', 'ノートパッド', 
            'Sổ ghi chú', 'دفتر الملاحظات', 'नोटपैड', 'નોટપેડ', 'Ikhadi lokubhala', 'Padi apileko', 'Yiga empapula', 'Kitabu cha kumbukumbu'],
}

df = pd.DataFrame(data)
print(df)
      language           note                    pad
0      English           Note                    Pad
1      Spanish           Nota          Bloc de notas
2       French           Note             Bloc-notes
3       German          Notiz             Notizblock
4      Italian           Nota         Blocco di note
5   Portuguese           Nota         Bloco de notas
6      Swedish     Anteckning       Anteckningsblock
7    Norwegian          Notat             Notatblokk
8   Icelandish  Minnispunktur              Minnisbók
9       Danish          Notat               Notesbog
10       Dutch        Notitie            Notitieblok
11       Latin           Nota                 Tabula
12     Russian        Заметка                Блокнот
13     Chinese             笔记                    记事本
14      Korean             노트                   노트패드
15    Japanese            ノート                 ノートパッド
16  Vietnamese        Ghi chú             Sổ ghi chú
17      Arabic          مذكرة         دفتر الملاحظات
18       Hindi            नोट                 नोटपैड
19    Gujarati            નોટ                 નોટપેડ
20        Zulu         Inothi       Ikhadi lokubhala
21      Yoruba         Akílẹ̀           Padi apileko
22     Luganda           Yiga          Yiga empapula
23     Swahili        Kumbuka  Kitabu cha kumbukumbu

migrating my ‘note book’ journal to central place for all things python/jupyter and it will go by bloc de notas:

~/dropbox/1f.ἡἔρις,κ/1.ontology

710. afecdvi#

  • priorities

  • vision

  • edge

  1. grants

    • in-house

    • collabo

    • trainees

  2. science

    • transplant

    • surgery

    • aging

  3. workflow

    • only human error from algorithms

    • none from manual processes

    • content production

      • git: data, code, .html

      • terminal: jupyter-lab, rstudio, vscode

      • version-control: hub, repo, pages

  4. beauty

    • data visualization

    • publication quality

    • surpassing expectations

  5. pyras

    • python \(\cdot\) r \(\cdot\) ai \(\cdot\) stata

    • efficient, versatile, opensource

    • cost-neutral

713. graçias 🙏#

  • hires

  • flow

  • ambitions

  • collabos

  • ds4ph or serendipity

714. archives#

  • from 08/2011 - 04/2023

  • just completed task

  • inserted at 06/24/2023

715. bilowozo#

  • text edit document - 8 MB

  • cancerously grown

  • url will not load on my phone

  • migrate to new book

  • use dependencies or links

  • so that bilowozo never grows beyond 8KB

716. diagnosis#

Yours truly: I’ve a jupyter-book with a particular page that is .ipynb on the back end (8 MB in size). over the last 2-3 days it’s been giving me the following error when I try to access it on my iPhone 12 Pro Max: “Problem repeatedly occurred on “https://muzaale.github.io/book/bilowozo.html”. In other words, the webpage crashes each time I try to load it. This has never happened even once since I created it in April 2023. that said, the webpage is perfectly accessible on my iMac and so I’m wondering why I can’t access it on my iphone.

ChatGPT: There are several possible reasons why you may be experiencing this issue. Here are a few possibilities:

  1. Size and Complexity of the Notebook: This is the most likely issue given the size of your notebook. Mobile browsers are generally less powerful than their desktop counterparts. They have less memory, less processing power, and stricter limitations to ensure that a single page can’t drain the device’s battery. An 8 MB .ipynb file could easily exceed these limitations, especially once it’s converted to HTML for display. The complexity of the notebook also matters - if it contains complex visualizations or large data tables, that can increase the resources needed to render the page.

  2. Browser Issues: The mobile version of Safari (the default browser on iOS devices) sometimes has compatibility issues with certain webpages or JavaScript features. You can test this by trying to access the page from another browser, such as Chrome or Firefox, on your iPhone.

  3. Cache or Cookies: Sometimes, the browser cache or cookies can cause pages to fail to load correctly. You might try clearing your cache and cookies to see if that resolves the problem.

  4. Internet Connection: If your iPhone is experiencing slow internet or is on a less reliable network, it might struggle to load the page. You can test this by trying to load the page while connected to a different WiFi network or using cellular data.

To resolve the issue, you might try the following:

  • Simplify the notebook. Remove any unnecessary images, visualizations, or other complex elements.

  • Split the notebook into several smaller pages. This could make each individual page less resource-intensive, making them easier to load on a mobile browser. (A minimal sketch of one way to do this follows below.)

  • If the page requires complex JavaScript to display the content, consider finding simpler ways to present the information.

  • Try accessing the page using a different browser on your iPhone.

  • Clear your browser’s cache and cookies.

  • Ensure that your iPhone is connected to a reliable network.

I hope this helps! Let me know if you have any other questions.
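
a minimal sketch of the “split the notebook into smaller pages” suggestion, assuming the nbformat package; the source filename bilowozo.ipynb and the 50-cell chunk size are illustrative assumptions, not the fix actually applied:

import nbformat

src = nbformat.read("bilowozo.ipynb", as_version=4)  # hypothetical oversized notebook
chunk = 50  # cells per output notebook (arbitrary choice)

for i in range(0, len(src.cells), chunk):
    # carry the original metadata so kernel/language settings survive the split
    part = nbformat.v4.new_notebook(cells=src.cells[i:i + chunk], metadata=src.metadata)
    nbformat.write(part, f"bilowozo_{i // chunk + 1:02d}.ipynb")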

07/18/2023#

717. migration#

  • successful

  • insightful

  • experiential

718. unix#

To create .ipynb files named as “bdnYYYYMM.ipynb” for the period from January 2011 to December 2024 by copying the content of “bdn001.ipynb,” you can use the following Linux command:

for y in {2011..2024}; do
  for m in {01..12}; do
    cp bdn001.ipynb "bdn${y}${m}.ipynb"
  done
done

This command uses nested loops. The outer loop iterates through the years from 2011 to 2024, and the inner loop iterates through the months represented by {01..12}. The command copies the content of “bdn001.ipynb” to files named as “bdnYYYYMM.ipynb” for each combination of year and month.
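
the same effect in python, as a sketch (assuming bdn001.ipynb sits in the current working directory):

import shutil

# copy bdn001.ipynb to bdnYYYYMM.ipynb for every month from 2011-01 to 2024-12
for y in range(2011, 2025):
    for m in range(1, 13):
        shutil.copyfile("bdn001.ipynb", f"bdn{y}{m:02d}.ipynb")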

719. r#

719.1 packages#

suppressWarnings({
  invisible({
    rm(list = ls())
    
    if (!require(icdpicr, quietly = TRUE, warn.conflicts = FALSE)) install.packages('icdpicr')
    if (!require(dplyr, quietly = TRUE, warn.conflicts = FALSE)) install.packages('dplyr')
    if (!require(readr, quietly = TRUE, warn.conflicts = FALSE)) install.packages('readr')
    if (!require(tidyr, quietly = TRUE, warn.conflicts = FALSE)) install.packages('tidyr')
    
    library(icdpicr, quietly = TRUE, warn.conflicts = FALSE)
    library(dplyr, quietly = TRUE, warn.conflicts = FALSE)
    library(readr, quietly = TRUE, warn.conflicts = FALSE)
    library(tidyr, quietly = TRUE, warn.conflicts = FALSE)
  })
})

 

719.2 documentation#

ls("package:icdpicr") 
> ls("package:icdpicr")
[1] "cat_trauma" "injury" 

719.3 syntax#

print(cat_trauma)
> print(cat_trauma)
function (df, dx_pre, icd10, i10_iss_method, calc_method = 1, 
    verbose = FALSE) 
{
    if (!is.data.frame(df)) 
        stop("First argument must be a dataframe")
    if (NROW(df) == 0) 
        stop("Data contains no observations. It must contain at least one row.")
    if (!is.character(dx_pre)) 
        stop("Second argument must be a character string")
    if (make.names(dx_pre) != dx_pre) 
        stop("Second argument must be a valid variable name in R")
    if (!(calc_method %in% c(1, 2))) 
        stop("calc_method must be either 1 or 2")
    if (!(icd10 %in% c(TRUE, FALSE, "cm", "base"))) 
        stop("icd10 must be TRUE, FALSE, 'cm', or 'base'")
    if (icd10 == FALSE) 
        i10_iss_method <- ""
    if (i10_iss_method == "roc_max") 
        stop("The roc_max option has been depricated. Please use roc_max_NIS, roc_max_TQIP, roc_max_NIS_only, or roc_max_TQIP_only instead.")
    if ((icd10 != FALSE) && !(i10_iss_method %in% c("roc_max_NIS", 
        "roc_max_TQIP", "roc_max_NIS_only", "roc_max_TQIP_only", 
        "gem_max", "gem_min"))) 
        stop("i10_iss_menthod must be roc_max_NIS, roc_max_TQIP, roc_max_NIS_only, roc_max_TQIP_only, gem_max, or gem_min.")
    regex_dx <- paste0("^", dx_pre, "([0-9]+)$")
    dx_colnames <- grep(regex_dx, names(df), value = TRUE)
    dx_nums <- as.numeric(sub(regex_dx, "\\1", dx_colnames))
    num_dx <- length(dx_nums)
    if (num_dx == 0) 
        stop("No variables with prefix found in data")
    df <- data.frame(df)
    if (isTRUE(icd10)) 
        icd10 <- "cm"
    if (icd10 %in% c("base", "cm")) {
        etab <- rbind(etab_s1, i10_ecode)
        ntab <- switch(i10_iss_method, roc_max_NIS = rbind(ntab_s1, 
            .select_i10_data("NIS", icd10)), roc_max_TQIP = rbind(ntab_s1, 
            .select_i10_data("TQIP", icd10)), roc_max_NIS_only = rbind(ntab_s1, 
            .select_i10_data("NIS_only", icd10)), roc_max_TQIP_only = rbind(ntab_s1, 
            .select_i10_data("TQIP_only", icd10)), gem_max = rbind(ntab_s1, 
            i10_map_max), gem_min = rbind(ntab_s1, i10_map_min))
    }
    else {
        ntab <- ntab_s1
        etab <- etab_s1
    }
    for (i in dx_nums) {
        dx_name <- paste0(dx_pre, i)
        df_ss <- df[, dx_name, drop = FALSE]
        df_ss$n <- 1:NROW(df_ss)
        df_ss[, dx_name] <- sub("\\.", "", df_ss[, dx_name])
        if (icd10 == TRUE & i10_iss_method == "roc_max") {
            i9_valid <- c("8", "9", "E")
            i10_valid <- c("S", "T", "U", "V", "W", "X", "Y")
            df_ss[, dx_name] <- ifelse(substr(df_ss[, dx_name], 
                1, 1) %in% c(i9_valid, i10_valid), df_ss[, dx_name], 
                NA)
            process_i10 <- function(s) {
                stopifnot(is.character(s) | is.na(s))
                ret_val <- NA
                s <- sub("\\.", "", s)
                if (!substr(s, 1, 1) %in% c("S", "T", "U", "V", 
                  "W", "X", "Y")) {
                  ret_val <- s
                }
                else if (nchar(s) < 7 & !grepl("X", substr(s, 
                  2, nchar(s)))) {
                  ret_val <- s
                }
                else if (nchar(s) != 7) {
                  ret_val <- ""
                }
                else if (substr(s, 7, 7) != "A") {
                  ret_val <- ""
                }
                else if (substr(s, 5, 5) == "X") {
                  ret_val <- substr(s, 1, 4)
                }
                else if (substr(s, 6, 6) == "X") {
                  ret_val <- substr(s, 1, 5)
                }
                else {
                  ret_val <- substr(s, 1, 6)
                }
                return(ret_val)
            }
            df_ss[, dx_name] <- sapply(df_ss[, dx_name], process_i10)
        }
        temp <- merge(df_ss, ntab, by.x = dx_name, by.y = "dx", 
            all.x = TRUE, all.y = FALSE, sort = FALSE)
        temp <- temp[order(temp$n), ]
        temp <- temp[, c("severity", "issbr")]
        if (calc_method == 2) {
            temp[which(temp$severity == 6), "severity"] <- 5
        }
        names(temp) <- paste0(c("sev_", "issbr_"), i)
        df <- .insert_columns(df, dx_name, temp)
    }
    body_regions <- unique(i10_map_max$issbr)
    issbr_names <- gsub("/", "_", body_regions)
    for (i in body_regions) {
        temp <- df[, grepl("sev_", names(df)), drop = FALSE] * 
            (1 * (df[, grepl("issbr_", names(df))] == i))
        df[, paste0("mxaisbr_", gsub("/", "", i))] <- apply(temp, 
            1, function(row) {
                row <- ifelse(row == 0, NA, row)
                if (all(is.na(row))) {
                  maxaisbr <- 0
                }
                else if (all(row == 9, na.rm = TRUE)) {
                  maxaisbr <- 9
                }
                else {
                  maxaisbr <- max(c(0, row[row != 9]), na.rm = TRUE)
                }
                return(maxaisbr)
            })
    }
    c9to0 <- function(x) ifelse(x == 9, 0, x)
    df$maxais <- apply(df, 1, function(row) {
        row <- row[grepl("mxaisbr", names(row))]
        if (all(is.na(row))) {
            maxais <- as.numeric(NA)
        }
        else if (max(c9to0(row), na.rm = TRUE) == 0) {
            maxais <- max(row, na.rm = TRUE)
        }
        else {
            maxais <- max(c9to0(row), na.rm = TRUE)
        }
        return(maxais)
    })
    df$maxais <- as.numeric(df$maxais)
    df$riss <- apply(df, 1, function(row) {
        temp <- row[grepl("^mxaisbr", names(row))]
        temp <- as.numeric(c9to0(temp))
        sum(temp[order(-temp)[1:3]]^2)
    })
    df[df$maxais == 6, "riss"] <- 75
    df[df$maxais == 9, "riss"] <- NA
    df$niss <- apply(df, 1, function(row) {
        temp <- row[grepl("^sev_", names(row))]
        temp <- as.numeric(temp)
        temp <- ifelse(is.na(temp) | temp == 9, 0, temp)
        sum(temp[order(-temp)[1:3]]^2)
    })
    df[df$maxais == 6, "niss"] <- 75
    df[df$maxais == 9, "niss"] <- NA
    ecode_colnames <- paste0("ecode_", 1:4)
    df[, ecode_colnames] <- NA
    ecode_regex <- paste0("^", etab$dx, collapse = "|")
    df[, ecode_colnames] <- t(apply(df, 1, function(row) {
        row <- sub("\\.", "", row)
        row_ecodes <- stringr::str_extract(as.character(unlist(row)), 
            ecode_regex)
        row_ecodes <- na.omit(row_ecodes)
        row_ecodes[1:4]
    }))
    for (i in 1:4) {
        col_name <- paste("ecode_", i, sep = "")
        df_ss <- df[, col_name, drop = FALSE]
        df_ss$n <- 1:NROW(df_ss)
        df_ss[, col_name] <- sub("\\.", "", df_ss[, col_name])
        temp <- merge(df_ss, etab, by.x = col_name, by.y = "dx", 
            all.x = TRUE, all.y = FALSE, sort = FALSE)
        temp <- temp[order(temp$n), ]
        temp <- temp[, c("mechmaj", "mechmin", "intent")]
        names(temp) <- paste(c("mechmaj", "mechmin", "intent"), 
            i, sep = "")
        df <- .insert_columns(df, col_name, temp)
    }
    if (stringr::str_detect(i10_iss_method, "NIS|TQIP") && icd10 %in% 
        c("cm", "base")) {
        if (verbose) 
            print("Calculating mortality prediction")
        coef_df <- .select_i10_coef(prefix = stringr::str_extract(i10_iss_method, 
            "NIS|TQIP"), icd10)
        stopifnot(max(coef_df$intercept, na.rm = TRUE) == min(coef_df$intercept, 
            na.rm = TRUE))
        intercept <- max(coef_df$intercept, na.rm = TRUE)
        coef_df <- coef_df[!is.na(coef_df$effect), ]
        effect_hash <- coef_df$effect
        names(effect_hash) <- coef_df$dx
        calc_mortality_prediction <- function(dx) {
            x <- sum(effect_hash[sub("\\.", "", dx)], na.rm = TRUE) + 
                intercept
            1/(1 + exp(-x))
        }
        mat <- as.matrix(df[, grepl(paste0("^", dx_pre), names(df))])
        df$Pmort <- apply(mat, 1, calc_mortality_prediction)
    }
    rownames(df) <- 1:nrow(df)
    df
}
<bytecode: 0x7fd33db28d30>
<environment: namespace:icdpicr>

719.4 dx_pre=”dx”#

print(injury)
> print(injury)
# A tibble: 100,477 × 11
   dx1      dx2      dx3      dx4      dx5      dx6      dx7      dx8      dx9      dx10      died
   <chr>    <chr>    <chr>    <chr>    <chr>    <chr>    <chr>    <chr>    <chr>    <chr>    <int>
 1 S72.342A NA       NA       NA       NA       NA       NA       NA       NA       NA           0
 2 S05.22XA NA       NA       NA       NA       NA       NA       NA       NA       NA           0
 3 S00.01XA S00.03XA S00.11XA S00.12XA S00.33XA S00.531A S00.83XA S01.511A S02.5XXA S80.11XA     0
 4 S21.119A NA       NA       NA       NA       NA       NA       NA       NA       NA           0
 5 S82.191A NA       NA       NA       NA       NA       NA       NA       NA       NA           0
 6 S22.42XA NA       NA       NA       NA       NA       NA       NA       NA       NA           0
 7 S92.052A S92.065A S92.325A S92.335A S92.345A S93.336A NA       NA       NA       NA           0
 8 S02.112A S06.5X0A S12.090A S12.100A S12.600A S20.211A S20.212A S22.010A S22.020A S22.038A     0
 9 S00.03XA S22.058A S22.068A S22.078A S22.41XA S30.1XXA S42.102A S62.647A S81.812A S82.202A     0
10 S61.411A S62.624B S62.626B S66.300A S66.302A S66.324A S66.326A NA       NA       NA           0
# ℹ 100,467 more rows
# ℹ Use `print(n = ...)` to see more rows

719.5 trial#

# use subset of injury given its size
inj = injury[1:100,1:3]

df_score = cat_trauma(df=inj,dx_pre="dx",icd10=TRUE,i10_iss_method="roc_max_NIS",calc_method = 1,verbose = FALSE) 

# visualize the output
df_score[1:9,1:9]; df_score[1:9,10:14]; df_score[1:9,15:18] ; df_score[1:9,34:35]
> df_score[1:9,1:9]; df_score[1:9,10:14]; df_score[1:9,15:18] ; df_score[1:9,34:35]
       dx1 sev_1     issbr_1      dx2 sev_2     issbr_2      dx3 sev_3     issbr_3
1 S72.342A     1 Extremities     <NA>    NA        <NA>     <NA>    NA        <NA>
2 S05.22XA     1        Face     <NA>    NA        <NA>     <NA>    NA        <NA>
3 S00.01XA     1   Head/Neck S00.03XA     2   Head/Neck S00.11XA     2        Face
4 S21.119A     3       Chest     <NA>    NA        <NA>     <NA>    NA        <NA>
5 S82.191A     3 Extremities     <NA>    NA        <NA>     <NA>    NA        <NA>
6 S22.42XA     3       Chest     <NA>    NA        <NA>     <NA>    NA        <NA>
7 S92.052A     1 Extremities S92.065A     1 Extremities S92.325A     1 Extremities
8 S02.112A     1        Face S06.5X0A     4   Head/Neck S12.090A     4   Head/Neck
9 S00.03XA     2   Head/Neck S22.058A     1       Chest S22.068A     3       Chest
  mxaisbr_General mxaisbr_HeadNeck mxaisbr_Face mxaisbr_Extremities mxaisbr_Chest
1               0                0            0                   1             0
2               0                0            1                   0             0
3               0                2            2                   0             0
4               0                0            0                   0             3
5               0                0            0                   3             0
6               0                0            0                   0             3
7               0                0            0                   1             0
8               0                4            1                   0             0
9               0                2            0                   0             3
  mxaisbr_Abdomen maxais riss niss
1               0      1    1    1
2               0      1    1    1
3               0      2    8    9
4               0      3    9    9
5               0      3    9    9
6               0      3    9    9
7               0      1    1    3
8               0      4   17   33
9               0      3   13   14
  intent4       Pmort
1    <NA> 0.013857916
2    <NA> 0.014412795
3    <NA> 0.017673701
4    <NA> 0.026078454
5    <NA> 0.027839150
6    <NA> 0.024364215
7    <NA> 0.005516156
8    <NA> 0.037222518
9    <NA> 0.021037152

07/19/2023#

720. dopamine#

  • less:

    • food

    • alcohol

    • weight (222.2 lbs)

  • more:

    • exercise

    • flow

    • sex

721. reconcile#

  • ms. acungura, aug xx, va

  • ms. peace, aug 26, otchy’s residence, va

  • ms. ssajjabbi, aug 28 - sep 4, ny

722. migration#

722. philosophy#

Yours Truly: quick and dirty on reality encode code decode counterfeit
ChatGPT:

  1. Reality: The state of things as they actually exist, irrespective of our subjective perceptions or interpretations.

  2. Encode: The process of converting data or a message into a specific format or system for secure transmission, processing, storage, or representation. This could be in the form of converting human language into computer code or converting a video into a compressed digital format.

  3. Code: A system of signals or symbols for communication. In computing, it refers to a set of instructions written by programmers to be executed by a computer. “Code” can also refer to a set of conventions or moral principles governing behavior.

  4. Decode: The process of converting an encoded format or system back into its original form. In media, this might mean converting a compressed digital video file into visible footage. In communication, decoding refers to interpreting and making sense of a received message.

  5. Counterfeit: An imitation or replica designed to deceive by appearing more valuable or genuine than it really is. This could be fake currency, knock-off brand products, or forged documents.

Combining the concepts: Reality can be represented and stored through the process of encoding, converting real-world events or data into a specific format or ‘code’. This ‘code’ can then be transmitted, stored, and eventually decoded to reproduce or understand the original information. However, just like tangible items can be counterfeited, so too can encoded information. This counterfeit data, once decoded, presents a skewed or false version of reality, leading to potential misconceptions or deceits.

723. afecdvi#

  • preface

    • beyond good & evil

    • you’re either better

    • or the fraud is better

    • in fact you can’t see fraud

    • unless you surpass the fraudster

  • reality

    • the source of our livelihood and misery

    • we seek to better understand it, tame it, utilize it

    • sentinel systems, data collection mechanisms, transmission, storage

  • encode

    • oneway

    • twoway

    • multivariable

    • hierarchical

    • pairs

      • python

      • ai

      • r

      • stata

  • code: dimensionality reduction from c(N) & c(k) -> r(mean), r(sd), etc.

    • nonparametric

      • central tendency

      • spread

      • p

    • parametric

      • central tendency

      • spread

      • p

    • semiparametric

      • central tendency

      • spread

      • p

  • decode

    • only parametric

    • can ‘reproduce’ original data

    • we call that simulation

    • random parametric distribution (see the sketch after this list)

  • fake

    • mastery of this process

    • can be used for representation

    • but can also be used to beguile

    • also used for encryption and transmission

    • portability of information with privacy

    • grant applications or policy decision

    • in lieu of data or with limited data

    • missing data methods and imputation
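
a minimal sketch of the encode-then-decode idea in the bullets above, using a made-up numeric sample (the exponential source data and the seed are assumptions for illustration): encoding keeps only the parametric summaries, and decoding simulates a counterfeit dataset from those summaries alone.

import numpy as np

rng = np.random.default_rng(2023)
observed = rng.exponential(scale=5, size=1000)   # stand-in for real data

# encode: keep only the parametric summaries (think r(N), r(mean), r(sd))
n, mu, sd = observed.size, observed.mean(), observed.std(ddof=1)

# decode: draw a fresh sample from a parametric distribution with those summaries
simulated = rng.normal(loc=mu, scale=sd, size=n)

print(f"observed  mean={mu:.2f} sd={sd:.2f}")
print(f"simulated mean={simulated.mean():.2f} sd={simulated.std(ddof=1):.2f}")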

724. counterfeiting#

Yours Truly: what are the uses and abuses of simulation or simulated data?
ChatGPT:

Simulations and simulated data play critical roles across various fields and industries. They can replicate complex systems or phenomena, making it possible to study scenarios that might be costly, dangerous, or even impossible to observe directly. However, as with many tools, simulations can be both beneficial and potentially misused.

Uses of Simulation or Simulated Data:

  1. Education and Training: Flight simulators train pilots, surgical simulators train doctors, and virtual labs teach scientific concepts to students.

  2. Predictive Modeling: Scientists and researchers use simulations to predict outcomes, such as the potential spread of diseases, the impact of climate change, or the behavior of complex systems under specific conditions.

  3. Product Design and Testing: Before manufacturing a product, companies often use simulations to test its design, reducing the need for expensive physical prototypes.

  4. Gaming and Entertainment: Simulations provide realistic environments in video games and virtual reality experiences.

  5. Financial Analysis: Simulations can predict market fluctuations and help in risk assessment.

  6. Urban Planning and Infrastructure: Simulated models can forecast the impact of new developments on traffic, pollution, and other urban factors.

  7. Safety Protocols: Simulations can be used to model disasters (like nuclear reactor meltdowns or chemical spills) to develop safety measures.

Abuses of Simulation or Simulated Data:

  1. Misrepresentation of Reality: Simulated data can be presented as genuine, misleading stakeholders or the public about real-world situations.

  2. Over-reliance: Assuming that a simulation can perfectly capture all aspects of reality can be a pitfall. This can lead to unanticipated outcomes if key variables or factors are overlooked.

  3. Confirmation Bias: Simulations can be tailored to produce desired outcomes, potentially confirming pre-existing beliefs rather than presenting objective results.

  4. Economic and Social Manipulation: Simulated data, when misrepresented, can be used to manipulate stock prices, political sentiments, or public opinion on critical issues.

  5. Ethical Concerns: Especially in medical or psychological research, the use of simulations might raise ethical issues, particularly if participants are not aware they’re interacting with simulated data or environments.

  6. Loss of Skills: Over-reliance on simulators, especially in professions like aviation, might lead to a decline in real-world skills or the ability to handle situations not covered in the simulation.

  7. Data Privacy and Security: Simulated data, especially if based on real-world individual data, might raise concerns about privacy and the potential misuse of information.

In conclusion, while simulations and simulated data offer a powerful means to understand, predict, and educate, they should be used judiciously and ethically, with a clear understanding of their limitations.

Yours Truly: please enrich this discussion and reproduce stuff.

  • mastery of this process

  • can be used for representation

  • but can also be used to beguile

  • also used for encryption and transmission

  • portability of information with privacy

  • grant applications or policy decision

  • in lieu of data or with limited data

  • missing data methods and imputation

ChatGPT:

Certainly! Let’s delve deeper into the utility and potential pitfalls of simulations and simulated data:

Mastery of Simulation Process:

  • Depth and Breadth: Achieving mastery in simulation requires understanding both the nuances of the specific system being modeled and the broader principles of modeling and simulation. This allows for the creation of detailed and accurate models that can predict real-world outcomes with high fidelity.

  • Iterative Refinement: Mastery also implies an ongoing process. Models are refined and retested as more data becomes available or as the real-world system changes.

Representation and Portability:

  • Clarity in Complexity: Complex phenomena can be broken down and represented in simulations, making them more understandable and accessible.

  • Portability of Information with Privacy: Simulation allows information to be abstracted. Real-world data can be converted into simulated data, ensuring the portability of insights without compromising individual privacy. This is especially crucial in areas like medical research where patient data confidentiality is paramount.

Uses in Critical Areas:

  • Grant Applications or Policy Decisions: In academic and policy circles, simulations can be used to predict the outcomes of proposed projects or policies. This predictive power can be persuasive in grant applications or in making data-driven policy decisions.

  • In Lieu of Data or with Limited Data: When direct data is scarce or difficult to obtain, simulations can fill in the gaps. For instance, in predicting the spread of a new disease, initial data might be limited, but simulations can help extrapolate potential future scenarios.

  • Missing Data Methods and Imputation: Simulated data can be especially useful when dealing with missing data points in a dataset. Advanced imputation methods, using simulations, can fill these gaps, making the data more complete and reliable.
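
As a toy illustration of that last point (my own sketch, with made-up numbers): stochastic imputation draws each missing value from a distribution fitted to the observed values, so the imputed column keeps roughly the original spread, unlike single mean imputation.

import numpy as np

rng = np.random.default_rng(7)

# a hypothetical variable with 30 of 200 values set to missing
x = rng.normal(100, 15, size=200)
x[rng.choice(200, size=30, replace=False)] = np.nan

# fit a simple parametric 'code' to the observed values only
obs = x[~np.isnan(x)]
mu, sd = obs.mean(), obs.std(ddof=1)

# stochastic imputation: simulate each missing entry from the fitted distribution
imputed = x.copy()
mask = np.isnan(x)
imputed[mask] = rng.normal(mu, sd, size=mask.sum())
print(f"observed sd = {sd:.1f}, sd after imputation = {imputed.std(ddof=1):.1f}")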

Dual-Edged Sword of Simulations:

  • A Tool to Beguile: While simulations can illuminate, they can also deceive. Misleading simulations might be used to support a particular agenda or bias, giving a veneer of data-driven objectivity to skewed information.

  • Encryption and Transmission: Simulated data, due to its abstraction from raw data, can be used in secure data transmission. By simulating the characteristics of data without revealing the data itself, it can be encrypted and sent more securely.

  • Risks with Imputation: While filling in missing data through simulated methods can be useful, it’s also fraught with risk. If not done correctly, it can introduce bias or inaccuracies into the dataset.

Conclusion:

The domain of simulation and simulated data is rich with potential, but it’s a tool that demands responsibility. Its mastery can open doors to deeper insights, clearer communication of complex topics, and innovative solutions to pressing problems. Yet, the same power that makes it invaluable also makes it a tool that can mislead or misrepresent. Users and consumers of simulated data must approach it with a blend of appreciation, caution, and critical thinking.

725. darkchild#

i never worked a day in my life: 8:04/1:30:01

726. mjb#

so I’m playing the piano in g-minor and play a chord that includes a bass D note plus f#, B♭, c and f (but this note resolves to E♭), then the entire chord resolves to g-minor (home). what chords are those?

First Chord: D, F#, B♭, C, and F: This is an altered D7, roughly D7#5#9:

  • D is the root.

  • F# is the major third.

  • B♭ (relative to D) serves as the augmented fifth (equivalently, the flat sixth or flat thirteenth).

  • C is the dominant seventh.

  • F (enharmonically E#) is the sharp ninth.

Second Chord: D, F#, B♭, C, and E♭: This is the first chord with F resolved down to E♭:

  • D remains the root.

  • F# remains the major third.

  • B♭ is still the augmented fifth (flat sixth).

  • C remains the dominant seventh.

  • E♭ introduces another altered tone, the flat ninth relative to D, though it is supposedly a rare inclusion in such a context. (not at all rare in jazz and gospel)

Resolution to G minor (G, B♭, and D):

  • D remains as D, the fifth of G minor.

  • F# resolves up a half step to G, the root of G minor.

  • B♭ remains as the minor third of G minor.

  • C has the tendency to resolve to B♭ or move to D.

  • E♭, being a tension note, can resolve to D.

In the context of moving to G minor:

  • The dominant nature of the D-rooted chords provides tension that naturally resolves to G minor.

  • The altered tones in the chords, like F and E♭, increase the tension, making the resolution to G minor more satisfying.

The progression showcases a sophisticated harmonic palette with tensions leading back to a familiar tonic.
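
A small Python check of the interval arithmetic above (my own illustration, not part of the exchange): the semitone distance from D determines each chord tone's label.

# pitch classes (C = 0, C# = 1, ..., B = 11)
pitch = {'D': 2, 'Eb': 3, 'F': 5, 'F#': 6, 'Bb': 10, 'C': 0}

# interval names relative to the root D
names = {0: 'root', 1: 'flat ninth', 3: 'sharp ninth (enharmonic E#)',
         4: 'major third', 8: 'augmented fifth / flat thirteenth',
         10: 'dominant seventh'}

for note in ['D', 'F#', 'Bb', 'C', 'F', 'Eb']:
    semitones = (pitch[note] - pitch['D']) % 12
    print(f"{note:>2} -> {names.get(semitones, semitones)}")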

07/20/2023#

727. stata#

  • week 1 content overview and approach; shouldn’t be stata-driven, rather concept-driven

  • oneway plot of age at transplant

  • scatterplot with overlaid mean & 95% CI (a Python analogue is sketched after this list)

  • introduce jitter (stata program + manually program it)

  • have a oneword stata command with optional syntax to produce this plot

  • of course this introduces the idea of writing flexible stata programs

  • automate the process by which axis titles and plot text are extracted and displayed

  • back-of-the-envelope calculations as a critical idea for life and also for this class

  • ensure plot looks the same regardless of stata version you are running
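
Here is the Python analogue promised above, purely a sketch with simulated ages (the course itself would build this in Stata): a jittered one-way scatter with the mean and a normal-approximation 95% CI overlaid.

import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
age = rng.normal(52, 11, size=120)   # hypothetical ages at transplant

# manual jitter: small horizontal noise so points don't overplot
x = rng.uniform(-0.08, 0.08, size=age.size)
plt.scatter(x, age, s=12, alpha=0.6, label='observations')

# overlay the mean and a normal-approximation 95% CI
mean = age.mean()
ci = 1.96 * age.std(ddof=1) / np.sqrt(age.size)
plt.errorbar(0, mean, yerr=ci, fmt='o', color='red', capsize=6, label='mean ± 95% CI')

plt.xlim(-0.5, 0.5)
plt.xticks([])
plt.ylabel('Age at transplant (years)')
plt.legend()
plt.show()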

728. chat#

What does hamlet mean when he talks of having “cause, will, strength, and means to do..”

In William Shakespeare’s play “Hamlet,” the phrase “cause, and will, and strength, and means” comes from Act IV, Scene IV, during Hamlet’s soliloquy after he witnesses the Norwegian army heading to battle over a trivial piece of land. This moment prompts Hamlet to reflect on his own inaction concerning avenging his father’s murder.

Here’s a breakdown:

  1. Cause: This refers to a valid reason. Hamlet has a genuine reason to act, given that his father was murdered.

  2. Will: This is the desire or intention to act. Hamlet feels the urge to seek vengeance for his father.

  3. Strength: The physical and mental capacity to act. Hamlet, being young and capable, possesses the strength to take revenge.

  4. Means: The resources or tools to carry out the act. As a prince, Hamlet has access to resources and opportunities to exact his revenge.

Despite having all these – a genuine reason, the desire, the capability, and the resources – Hamlet has been unable to act on his intentions. This soliloquy is a reflection of his deep frustration and self-reproach for his perceived inaction, especially when contrasted with the Norwegian army willing to fight and die for a cause that seems insignificant to him. He laments his own lack of action despite having more significant reasons to act.

Are these terms analogous to the verbs voir, savoir, pourvoir?

The terms “cause, and will, and strength, and means” from Shakespeare’s “Hamlet” aren’t directly analogous to the French verbs “voir” (to see), “savoir” (to know), and “pourvoir” (to provide for or supply). However, if we want to draw parallels in a poetic or symbolic sense, we might come up with some interesting connections:

  1. Voir (to see): This could be linked to “cause,” in the sense that Hamlet “sees” or recognizes the injustice done to his father and the subsequent situation in the royal court. He has a clear vision of the wrongdoing.

  2. Savoir (to know): This can be associated with “will,” since Hamlet “knows” deep down what he should do, reflecting his internal understanding and conviction about avenging his father.

  3. Pourvoir (to provide for): This might be related to “means,” as it’s about having the resources or provisions to act. Hamlet, being the prince, has the means or resources at his disposal.

“Strength” doesn’t have a direct match in this set of French verbs, but the idea here is to draw symbolic connections rather than direct translations or exact analogies. So, while the verbs and the terms from Hamlet aren’t directly analogous, one can creatively interpret connections between them.

And that’s exactly what I was seeking, merci! Now get us a French term with poetic connection to means and ending phonetically in voir

Certainly! The French verb “prévoir” (to foresee, to anticipate, or to plan for) could fit your criteria. Phonetically, it ends in “voir” like the other verbs you mentioned.

Prévoir (to foresee/anticipate): This can be poetically linked to “means.” If one has the foresight or the ability to anticipate, they might be better prepared with the means to tackle a situation. Hamlet, in his contemplative nature, often tries to foresee the outcomes of his actions, which in turn influences the means or methods he might employ.

Merci

De rien! If you have any more questions, feel free to ask. Bonne journée!

729. accountability#

  • mark to be a partner with regard to diet, alcohol & exercise

  • yesterday i weighed in at 222 lbs

  • the goal was 170 lbs, set 12 months ago!

730. pairs#

  • python

  • ai

  • r

  • stata

my workflow treats these as ingredients rather than as disparate, even competing or mutually exclusive, platforms

this has been hastened by a confluence of real-world factors:

  1. ds4ph

  2. ph340.600

  3. jupyter-book

  4. chatGPT

  5. real-world needs

731. music#

  • life consists of the emotional states of flow, anxiety, and boredom

  • music evokes these through tension & release cycles

  • this is what makes it the most powerful of all human creations

  • it is a simulator of emotions without the drudgery and risks of life

  • we can enjoy it collectively in a dionysian spirit that says yes!! in spite of everything

732. app#

  • anticiper (prévoir)

  • participer (pourvoir)

  • précipiter (provoquer)

733. timecheck#

  • 1:29:45

  • struggle

  • deliverance [36]

  • reverence

734. constant#

  • energy (heraclitean hustle)

  • credo (platonic ideal) [36]

  • place (epicurean prudence)

735. essay#

and what are they. also, write a 500 word essay on these three characters and one must emerge as the winner. please don’t be dull, democratic, and overly sensible. give us a human touch with bias

Epicurus, Plato, and Heraclitus represent three prominent figures in ancient philosophy, each contributing unique perspectives and ideas to the field. In this essay, we will explore the philosophies of these three characters, their key concepts, and evaluate their contributions. Please note that the following essay will incorporate a human touch with bias.

Epicurus, known as an Epicurean philosopher, emphasized the pursuit of happiness and pleasure as the ultimate goal of life. He believed that pleasure should be sought, but not in the form of hedonistic indulgence. Epicurus advocated for a moderate and balanced approach, focusing on mental tranquility and the avoidance of pain. He taught that by attaining ataraxia (freedom from disturbance) and aponia (freedom from bodily pain), individuals could achieve a state of happiness. Epicurus valued friendship, simple living, and the absence of fear in his philosophy.

On the other hand, Plato, one of the most renowned philosophers in history, founded the Academy and proposed the theory of Forms. Plato believed that the physical world we perceive is merely a reflection of a higher reality of eternal and unchanging Forms. According to Plato, knowledge and truth are obtained through reason and contemplation of these abstract Forms. He viewed the pursuit of knowledge and the cultivation of virtue as the path to the Good, which he considered the highest aim of human life. Plato’s philosophy emphasized the importance of justice, wisdom, and the existence of an ideal state.

Lastly, Heraclitus, known for his enigmatic and paradoxical statements, held a radically different perspective. Heraclitus believed in the constant flux and change in the universe. He famously stated, “You cannot step into the same river twice.” According to Heraclitus, change and opposition are fundamental aspects of existence. He emphasized the concept of “Logos,” an underlying principle that governs the world, representing the interconnectedness and harmony within the perpetual flux. Heraclitus encouraged individuals to embrace change, to see it as a natural part of life, and to develop an understanding of the interconnectedness of all things.

Now, considering these three philosophies, it becomes a challenge to determine a clear winner. Each philosopher offers valuable insights and contributes to the rich tapestry of human thought. However, it is essential to acknowledge the biases inherent in any evaluation. In this context, I find myself drawn to the profound ideas of Plato.

Plato’s philosophy, with its focus on reason, virtue, and the pursuit of knowledge, resonates deeply. His theory of Forms provides a compelling framework for understanding the nature of reality and the human quest for truth. Plato’s emphasis on the ideal state, justice, and wisdom speaks to the desire for a just and harmonious society. The enduring influence of Plato’s ideas on subsequent philosophers and his impact on the development of Western philosophy further substantiate his significance.

While Epicurus and Heraclitus present intriguing perspectives, the broad applicability and timeless relevance of Plato’s ideas make him stand out. However, it is crucial to acknowledge that this evaluation is subjective and influenced by personal biases. Other individuals may find solace and inspiration in the teachings of Epicurus or the enigmatic wisdom of Heraclitus.

In conclusion, the philosophies of Epicurus, Plato, and Heraclitus offer distinct perspectives on life, happiness, and the nature of reality. While each philosopher contributes valuable insights, Plato’s emphasis on reason, virtue, and the pursuit of knowledge emerges as a powerful force. However, it is essential to recognize the subjectivity and bias inherent in this evaluation. Philosophy, like life itself, is complex and multifaceted, and the richness of human thought lies in the diversity of perspectives that contribute to our understanding of the world.

oh well, chatbot, my natural inclination is with epicurus. though i must confess that in my younger, better days i was without any doubt a splendid mishmash of heraclitus (music) & plato (anglican). i’m guessing, now that i think of it, that the tension (hymns vs babyface) ultimately resolved by choosing epicurus.

The tension you mention between Heraclitus and Plato, represented symbolically through “hymns vs babyface,” is intriguing. These two philosophers offer different ways of seeing the world—one rooted in change and process, and the other in eternal, unchanging ideals. Your eventual gravitation towards Epicurus might suggest a preference for a more grounded and pragmatic philosophy, one that focuses on the immediate pleasures and pains of life and the pursuit of happiness here and now.

Life is a continuous journey of self-discovery, and our beliefs and inclinations can change as we gather more experiences and insights. It’s always enlightening to reflect on our intellectual and spiritual paths and see how they shape our current worldview.

well spoken. by pragmatic i take it you mean proverbs 27:12 (ie a fixture in place or geography and avoidance of evil)

Thank you for your kind words.

Proverbs 27:12 from the Bible states: “The prudent see danger and take refuge, but the simple keep going and pay the penalty.”

This proverb emphasizes the value of foresight (prévoir) and prudence in the face of potential harm. It suggests that wisdom lies in recognizing threats and acting accordingly to avoid them, while those who lack such discernment face consequences.

When I referred to Epicurus as being “grounded and pragmatic,” I was speaking to his philosophy’s emphasis on real-world, tangible happiness and well-being, rather than metaphysical or abstract ideals. Epicurus advised living in a way that maximizes pleasure (understood as tranquility and the absence of pain) and minimizes suffering.

However, drawing a parallel between Epicurean pragmatism and the prudence highlighted in Proverbs 27:12 is an insightful connection. Both promote an awareness of one’s surroundings and circumstances, advocating for actions that lead to the best possible outcomes for oneself.

736. götzendämmerung#

Hide code cell source
import requests
from bs4 import BeautifulSoup
from tabulate import tabulate

# Send a request to the Wikipedia page
url = 'https://en.wikipedia.org/wiki/List_of_most-followed_Instagram_accounts'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Find the table rows containing the data
table = soup.find('table', class_='wikitable')
rows = table.select('tbody tr')[1:]  # Skip the header row

# Extract data from each row and store it in a list
data = []
for row in rows:
    columns = row.find_all('td')
    
    if len(columns) >= 3:
        rank = columns[1].text.strip()
        owner = columns[3].text.strip()
        followers = columns[5].text.strip().replace('\xa0', '')  # Remove non-breaking spaces
        data.append([rank, owner, followers])

# Print the table using tabulate
headers = ["Name", "Followers", "Country"]
table = tabulate(data, headers=headers, tablefmt="plain")

print(table)
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Cell In[13], line 22
     20         rank = columns[1].text.strip()
     21         owner = columns[3].text.strip()
---> 22         followers = columns[5].text.strip().replace('\xa0', '')  # Remove non-breaking spaces
     23         data.append([rank, owner, followers])
     25 # Print the table using tabulate

IndexError: list index out of range
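
The traceback points at a likely cause: the guard checks len(columns) >= 3 but the loop then indexes columns[5], and Wikipedia rows whose leading cells are merged via rowspan come back with fewer td elements. Below is a minimal defensive sketch, not a verified fix: the column positions 1, 3, 5 are assumptions carried over from the cell above and may no longer match the live page, and rows with merged cells are simply skipped rather than re-aligned.

import requests
from bs4 import BeautifulSoup

url = 'https://en.wikipedia.org/wiki/List_of_most-followed_Instagram_accounts'
soup = BeautifulSoup(requests.get(url).text, 'html.parser')
rows = soup.find('table', class_='wikitable').select('tbody tr')[1:]

data = []
for row in rows:
    columns = row.find_all('td')
    # guard on the largest index actually used (5), not just 3
    if len(columns) >= 6:
        data.append([columns[1].text.strip(),                        # rank
                     columns[3].text.strip(),                        # owner
                     columns[5].text.strip().replace('\xa0', '')])   # followers
print(len(data), 'rows parsed')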
Hide code cell source
def print_table(data):
    header = data[0]
    rows = data[1:]

    col_widths = [max(len(str(cell)) for cell in col) for col in zip(*data)]

    print("Name23".ljust(col_widths[0]), "Followers23".ljust(col_widths[1]), "Country")
    print("=" * sum(col_widths))

    for row in rows:
        print(str(row[0]).ljust(col_widths[0]), str(row[1]).ljust(col_widths[1]), str(row[2]))

# Hardcoded output from BeautifulSoup(response.text, 'html.parser') in 2023
data = [
    ["Name", "Followers", "Country"],
    ["Instagram", "649", "United States"],
    ["Cristiano Ronaldo", "597", "Portugal"],
    ["Lionel Messi", "480", "Argentina"],
    ["Selena Gomez", "426", "United States"],
    ["Kylie Jenner", "397", "United States"],
    ["Dwayne Johnson", "387", "United States"],
    ["Ariana Grande", "377", "United States"],
    ["Kim Kardashian", "362", "United States"],
    ["Beyoncé", "314", "United States"],
    ["Khloé Kardashian", "310", "United States"],
    ["Nike", "302", "United States"],
    ["Justin Bieber", "293", "Canada"],
    ["Kendall Jenner", "293", "United States"],
    ["National Geographic", "281", "United States"],
    ["Taylor Swift", "268", "United States"],
    ["Virat Kohli", "255", "India"],
    ["Jennifer Lopez", "249", "United States"],
    ["Nicki Minaj", "223", "Trinidad and TobagoUnited States"],
    ["Kourtney Kardashian", "223", "United States"],
    ["Miley Cyrus", "212", "United States"],
    ["Neymar", "210", "Brazil"],
    ["Katy Perry", "203", "United States"],
    ["Zendaya", "183", "United States"],
    ["Kevin Hart", "178", "United States"],
    ["Cardi B", "168", "United States"],
    ["LeBron James", "156", "United States"],
    ["Demi Lovato", "156", "United States"],
    ["Rihanna", "152", "Barbados"],
    ["Real Madrid CF", "143", "Spain"],
    ["Chris Brown", "142", "United States"],
    ["Drake", "140", "Canada"],
    ["Ellen DeGeneres", "139", "United States"],
    ["FC Barcelona", "122", "Spain"],
    ["Billie Eilish", "109", "United States"],
    ["UEFA Champions League", "107", "Europe"],
    ["Gal Gadot", "105", "Israel"],
    ["Kylian Mbappé", "105", "France"],
    ["Vin Diesel", "97.5", "United States"],
    ["Lisa", "95.9", "Thailand"],
    ["NASA", "94.3", "United States"],
    ["Dua Lipa", "88.6", "United KingdomAlbania"],
    ["Priyanka Chopra", "88.5", "India"],
    ["Shakira", "87.9", "Colombia"],
    ["NBA", "82.4", "United StatesCanada"],
    ["Shraddha Kapoor", "81.7", "India"],
    ["David Beckham", "80.8", "United Kingdom"],
    ["Jennie", "80.2", "South Korea"],
    ["Snoop Dogg", "80.2", "United States"],
    ["Khaby Lame", "79.7", "ItalySenegal"],
    ["Gigi Hadid", "78.8", "United States"]
]

print_table(data)
Hide code cell output
Name23                Followers23 Country
==============================================================
Instagram             649       United States
Cristiano Ronaldo     597       Portugal
Lionel Messi          480       Argentina
Selena Gomez          426       United States
Kylie Jenner          397       United States
Dwayne Johnson        387       United States
Ariana Grande         377       United States
Kim Kardashian        362       United States
Beyoncé               314       United States
Khloé Kardashian      310       United States
Nike                  302       United States
Justin Bieber         293       Canada
Kendall Jenner        293       United States
National Geographic   281       United States
Taylor Swift          268       United States
Virat Kohli           255       India
Jennifer Lopez        249       United States
Nicki Minaj           223       Trinidad and TobagoUnited States
Kourtney Kardashian   223       United States
Miley Cyrus           212       United States
Neymar                210       Brazil
Katy Perry            203       United States
Zendaya               183       United States
Kevin Hart            178       United States
Cardi B               168       United States
LeBron James          156       United States
Demi Lovato           156       United States
Rihanna               152       Barbados
Real Madrid CF        143       Spain
Chris Brown           142       United States
Drake                 140       Canada
Ellen DeGeneres       139       United States
FC Barcelona          122       Spain
Billie Eilish         109       United States
UEFA Champions League 107       Europe
Gal Gadot             105       Israel
Kylian Mbappé         105       France
Vin Diesel            97.5      United States
Lisa                  95.9      Thailand
NASA                  94.3      United States
Dua Lipa              88.6      United KingdomAlbania
Priyanka Chopra       88.5      India
Shakira               87.9      Colombia
NBA                   82.4      United StatesCanada
Shraddha Kapoor       81.7      India
David Beckham         80.8      United Kingdom
Jennie                80.2      South Korea
Snoop Dogg            80.2      United States
Khaby Lame            79.7      ItalySenegal
Gigi Hadid            78.8      United States
Hide code cell source
from tabulate import tabulate

data = [
    [1, "Instagram", "400 million"],
    [2, "Cristiano Ronaldo", "340 million"],
    [3, "Dwayne 'The Rock' Johnson", "245 million"],
    [4, "Kylie Jenner", "240 million"],
    [5, "Kim Kardashian", "230 million"],
    [6, "Lionel Messi", "220 million"],
    [7, "Beyoncé", "170 million"],
    [8, "Neymar Jr.", "160 million"],
    [9, "Taylor Swift", "150 million"],
    [10, "National Geographic", "150 million"],
    [11, "Justin Bieber", "150 million"],
    [12, "Kendall Jenner", "140 million"],
    [13, "Jennifer Lopez", "140 million"],
    [14, "Nicki Minaj", "140 million"],
    [15, "Nike", "130 million"],
    [16, "Miley Cyrus", "130 million"],
    [17, "Katy Perry", "130 million"],
    [18, "Khloé Kardashian", "130 million"],
    [19, "Rihanna", "120 million"],
    [20, "Kourtney Kardashian", "120 million"],
    [21, "Virat Kohli", "120 million"],
    [22, "NASA", "120 million"],
    [23, "Ellen DeGeneres", "100 million"],
    [24, "Real Madrid C.F.", "100 million"],
    [25, "FC Barcelona", "100 million"],
    [26, "LeBron James", "100 million"],
    [27, "Demi Lovato", "100 million"],
    [28, "Zlatan Ibrahimović", "95 million"],
    [29, "Shakira", "95 million"],
    [30, "Nicky Jam", "95 million"],
    [31, "Kevin Hart", "95 million"],
    [32, "NASA Goddard", "95 million"],
    [33, "Priyanka Chopra Jonas", "95 million"],
    [34, "9GAG", "95 million"],
    [35, "Cardi B", "95 million"],
    [36, "Ronaldinho", "95 million"],
    [37, "Emma Watson", "90 million"],
    [38, "Chris Brown", "90 million"],
    [39, "Selena Gomez", "90 million"],
    [40, "The Weeknd", "90 million"],
    [41, "Narendra Modi", "90 million"],
    [42, "Gareth Bale", "90 million"],
    [43, "FC Bayern Munich", "90 million"],
    [44, "Nike Football", "85 million"],
    [45, "Vin Diesel", "85 million"],
    [46, "Justin Timberlake", "85 million"],
    [47, "NASA Jet Propulsion Laboratory", "85 million"],
    [48, "David Beckham", "85 million"],
    [49, "Maluma", "85 million"],
    [50, "Billie Eilish", "85 million"]
]

headers = ["Rank21", "Name21", "Followers21"]

table = tabulate(data, headers=headers, tablefmt="plain")

print(table)
  Rank21  Name21                          Followers21
       1  Instagram                       400 million
       2  Cristiano Ronaldo               340 million
       3  Dwayne 'The Rock' Johnson       245 million
       4  Kylie Jenner                    240 million
       5  Kim Kardashian                  230 million
       6  Lionel Messi                    220 million
       7  Beyoncé                         170 million
       8  Neymar Jr.                      160 million
       9  Taylor Swift                    150 million
      10  National Geographic             150 million
      11  Justin Bieber                   150 million
      12  Kendall Jenner                  140 million
      13  Jennifer Lopez                  140 million
      14  Nicki Minaj                     140 million
      15  Nike                            130 million
      16  Miley Cyrus                     130 million
      17  Katy Perry                      130 million
      18  Khloé Kardashian                130 million
      19  Rihanna                         120 million
      20  Kourtney Kardashian             120 million
      21  Virat Kohli                     120 million
      22  NASA                            120 million
      23  Ellen DeGeneres                 100 million
      24  Real Madrid C.F.                100 million
      25  FC Barcelona                    100 million
      26  LeBron James                    100 million
      27  Demi Lovato                     100 million
      28  Zlatan Ibrahimović              95 million
      29  Shakira                         95 million
      30  Nicky Jam                       95 million
      31  Kevin Hart                      95 million
      32  NASA Goddard                    95 million
      33  Priyanka Chopra Jonas           95 million
      34  9GAG                            95 million
      35  Cardi B                         95 million
      36  Ronaldinho                      95 million
      37  Emma Watson                     90 million
      38  Chris Brown                     90 million
      39  Selena Gomez                    90 million
      40  The Weeknd                      90 million
      41  Narendra Modi                   90 million
      42  Gareth Bale                     90 million
      43  FC Bayern Munich                90 million
      44  Nike Football                   85 million
      45  Vin Diesel                      85 million
      46  Justin Timberlake               85 million
      47  NASA Jet Propulsion Laboratory  85 million
      48  David Beckham                   85 million
      49  Maluma                          85 million
      50  Billie Eilish                   85 million

737. survey#

Candidate Evaluation Form

1. What is your professional connection to the candidate?
- [ ] Peer, coworker, or colleague
- [ ] Former supervisor, manager, or professor
- [ ] Current supervisor, manager, or professor
- [ ] Friend (no past or current work relation)
- [ ] Subordinate
- [ ] Client
- [ ] Other

2. How current is your relationship to the candidate?
- [ ] Currently work or interact together
- [ ] Worked or interacted 5+ years ago

3. How long have you worked or interacted together?
- [ ] Less than 6 months
- [ ] Between 6 and 18 months
- [ ] Between 18 months and 3 years
- [ ] More than 3 years

4. Did you work...
- [ ] In direct contact?
- [ ] Sometimes on the same projects?
- [ ] Rarely on the same project?
- [ ] We have never worked together

5. If you had the opportunity, would you hire or recommend the candidate?
Scale: 1 to 7

6. How would you rate the overall quality of the candidate?
Scale: 1 to 7

7. Please evaluate the candidate in the following areas on scales of 1 to 7:
   1. Mastery of industry and product knowledge
   2. Relevant behaviors for success in the job
   3. Work-related skills
   4. Ability to display energy in all tasks
   5. Ability to make decisions in a timely manner
   6. Ability to execute and create results
   7. Ability to energize others
   8. Ability to respond to constructive feedback
   9. Attendance, punctuality
  10. Ability to develop relationships

8. What was the candidate's reason for leaving your organization?
_(Provide answer here)_

9. Please describe the candidate's primary duties and responsibilities while working for you.
_(Provide answer here)_

10. Please describe the candidate's interpersonal skills, including their ability to interact effectively with other coworkers and customers.
_(Provide answer here)_

11. What are the candidate's top strengths?
_(Provide answer here)_

12. In what areas could the candidate improve?
_(Provide answer here)_

13. Does the candidate have supervisory or management responsibility? If so, please describe their management style.
_(Provide answer here)_

14. Has the candidate ever been disciplined or discharged?
- [ ] Yes
- [ ] No

15. Do you believe the candidate to be a good fit for the position? Why or why not?
_(Provide answer here)_

16. Is there anything else you would like to share about the candidate?
_(Provide answer here)_

17. Rater information:
- [ ] Phone number
- [ ] Additional information
- [ ] Employer history

738. passion#

st. matthew

  • st. matthew is the patron saint of accountants

  • bwv 244 is the passion according to st. matthew

  • no. 44: choral: befiehl du deine wege

  • hymn 181: passion chorale: o sacred head, now wounded (surrounded by the crown of thorns)

  • music: the passion chorale melody is hans leo hassler's secular song 'Mein G'müt ist mir verwirret', from his Lustgarten Neuer Teutscher Gesäng (1601)

  • harmonized by johann sebastian bach in 1729

739. merge#

Hide code cell source
from tabulate import tabulate

data21 = [
    ["Rank21", "Name21", "Followers21"],
    [1, "Instagram", "400 million"],
    [2, "Cristiano Ronaldo", "340 million"],
    [3, "Dwayne 'The Rock' Johnson", "245 million"],
    [4, "Kylie Jenner", "240 million"],
    [5, "Kim Kardashian", "230 million"],
    [6, "Lionel Messi", "220 million"],
    [7, "Beyoncé", "170 million"],
    [8, "Neymar Jr.", "160 million"],
    [9, "Taylor Swift", "150 million"],
    [10, "National Geographic", "150 million"],
    [11, "Justin Bieber", "150 million"],
    [12, "Kendall Jenner", "140 million"],
    [13, "Jennifer Lopez", "140 million"],
    [14, "Nicki Minaj", "140 million"],
    [15, "Nike", "130 million"],
    [16, "Miley Cyrus", "130 million"],
    [17, "Katy Perry", "130 million"],
    [18, "Khloé Kardashian", "130 million"],
    [19, "Rihanna", "120 million"],
    [20, "Kourtney Kardashian", "120 million"],
    [21, "Virat Kohli", "120 million"],
    [22, "NASA", "120 million"],
    [23, "Ellen DeGeneres", "100 million"],
    [24, "Real Madrid C.F.", "100 million"],
    [25, "FC Barcelona", "100 million"],
    [26, "LeBron James", "100 million"],
    [27, "Demi Lovato", "100 million"],
    [28, "Zlatan Ibrahimović", "95 million"],
    [29, "Shakira", "95 million"],
    [30, "Nicky Jam", "95 million"],
    [31, "Kevin Hart", "95 million"],
    [32, "NASA Goddard", "95 million"],
    [33, "Priyanka Chopra Jonas", "95 million"],
    [34, "9GAG", "95 million"],
    [35, "Cardi B", "95 million"],
    [36, "Ronaldinho", "95 million"],
    [37, "Emma Watson", "90 million"],
    [38, "Chris Brown", "90 million"],
    [39, "Selena Gomez", "90 million"],
    [40, "The Weeknd", "90 million"],
    [41, "Narendra Modi", "90 million"],
    [42, "Gareth Bale", "90 million"],
    [43, "FC Bayern Munich", "90 million"],
    [44, "Nike Football", "85 million"],
    [45, "Vin Diesel", "85 million"],
    [46, "Justin Timberlake", "85 million"],
    [47, "NASA Jet Propulsion Laboratory", "85 million"],
    [48, "David Beckham", "85 million"],
    [49, "Maluma", "85 million"],
    [50, "Billie Eilish", "85 million"]
]

data23 = [
    ["Name23", "Followers23", "Country"],
    ["Instagram", "649", "United States"],
    ["Cristiano Ronaldo", "597", "Portugal"],
    ["Lionel Messi", "480", "Argentina"],
    ["Selena Gomez", "426", "United States"],
    ["Kylie Jenner", "397", "United States"],
    ["Dwayne Johnson", "387", "United States"],
    ["Ariana Grande", "377", "United States"],
    ["Kim Kardashian", "362", "United States"],
    ["Beyoncé", "314", "United States"],
    ["Khloé Kardashian", "310", "United States"],
    ["Nike", "302", "United States"],
    ["Justin Bieber", "293", "Canada"],
    ["Kendall Jenner", "293", "United States"],
    ["National Geographic", "281", "United States"],
    ["Taylor Swift", "268", "United States"],
    ["Virat Kohli", "255", "India"],
    ["Jennifer Lopez", "249", "United States"],
    ["Nicki Minaj", "223", "Trinidad and Tobago, United States"],
    ["Kourtney Kardashian", "223", "United States"],
    ["Miley Cyrus", "212", "United States"],
    ["Neymar", "210", "Brazil"],
    ["Katy Perry", "203", "United States"],
    ["Zendaya", "183", "United States"],
    ["Kevin Hart", "178", "United States"],
    ["Cardi B", "168", "United States"],
    ["LeBron James", "156", "United States"],
    ["Demi Lovato", "156", "United States"],
    ["Rihanna", "152", "Barbados"],
    ["Real Madrid CF", "143", "Spain"],
    ["Chris Brown", "142", "United States"],
    ["Drake", "140", "Canada"],
    ["Ellen DeGeneres", "139", "United States"],
    ["FC Barcelona", "122", "Spain"],
    ["Billie Eilish", "109", "United States"],
    ["UEFA Champions League", "107", "Europe"],
    ["Gal Gadot", "105", "Israel"],
    ["Kylian Mbappé", "105", "France"],
    ["Vin Diesel", "97.5", "United States"],
    ["Lisa", "95.9", "Thailand"],
    ["NASA", "94.3", "United States"],
    ["Dua Lipa", "88.6", "United Kingdom, Albania"],
    ["Priyanka Chopra", "88.5", "India"],
    ["Shakira", "87.9", "Colombia"],
    ["NBA", "82.4", "United States, Canada"],
    ["Shraddha Kapoor", "81.7", "India"],
    ["David Beckham", "80.8", "United Kingdom"],
    ["Jennie", "80.2", "South Korea"],
    ["Snoop Dogg", "80.2", "United States"],
    ["Khaby Lame", "79.7", "Italy, Senegal"],
    ["Gigi Hadid", "78.8", "United States"]
]

# Combine the data
combined_data = []
for i in range(1, 51):
    row_data21 = data21[i]
    row_data23 = data23[i]
    combined_row = row_data21 + row_data23
    combined_data.append(combined_row)

# Display the combined table
print(tabulate(combined_data, headers=["Rank21", "Name21", "Followers21", "Name23", "Followers23", "Country"]))
  Rank21  Name21                          Followers21    Name23                   Followers23  Country
--------  ------------------------------  -------------  ---------------------  -------------  ----------------------------------
       1  Instagram                       400 million    Instagram                      649    United States
       2  Cristiano Ronaldo               340 million    Cristiano Ronaldo              597    Portugal
       3  Dwayne 'The Rock' Johnson       245 million    Lionel Messi                   480    Argentina
       4  Kylie Jenner                    240 million    Selena Gomez                   426    United States
       5  Kim Kardashian                  230 million    Kylie Jenner                   397    United States
       6  Lionel Messi                    220 million    Dwayne Johnson                 387    United States
       7  Beyoncé                         170 million    Ariana Grande                  377    United States
       8  Neymar Jr.                      160 million    Kim Kardashian                 362    United States
       9  Taylor Swift                    150 million    Beyoncé                        314    United States
      10  National Geographic             150 million    Khloé Kardashian               310    United States
      11  Justin Bieber                   150 million    Nike                           302    United States
      12  Kendall Jenner                  140 million    Justin Bieber                  293    Canada
      13  Jennifer Lopez                  140 million    Kendall Jenner                 293    United States
      14  Nicki Minaj                     140 million    National Geographic            281    United States
      15  Nike                            130 million    Taylor Swift                   268    United States
      16  Miley Cyrus                     130 million    Virat Kohli                    255    India
      17  Katy Perry                      130 million    Jennifer Lopez                 249    United States
      18  Khloé Kardashian                130 million    Nicki Minaj                    223    Trinidad and Tobago, United States
      19  Rihanna                         120 million    Kourtney Kardashian            223    United States
      20  Kourtney Kardashian             120 million    Miley Cyrus                    212    United States
      21  Virat Kohli                     120 million    Neymar                         210    Brazil
      22  NASA                            120 million    Katy Perry                     203    United States
      23  Ellen DeGeneres                 100 million    Zendaya                        183    United States
      24  Real Madrid C.F.                100 million    Kevin Hart                     178    United States
      25  FC Barcelona                    100 million    Cardi B                        168    United States
      26  LeBron James                    100 million    LeBron James                   156    United States
      27  Demi Lovato                     100 million    Demi Lovato                    156    United States
      28  Zlatan Ibrahimović              95 million     Rihanna                        152    Barbados
      29  Shakira                         95 million     Real Madrid CF                 143    Spain
      30  Nicky Jam                       95 million     Chris Brown                    142    United States
      31  Kevin Hart                      95 million     Drake                          140    Canada
      32  NASA Goddard                    95 million     Ellen DeGeneres                139    United States
      33  Priyanka Chopra Jonas           95 million     FC Barcelona                   122    Spain
      34  9GAG                            95 million     Billie Eilish                  109    United States
      35  Cardi B                         95 million     UEFA Champions League          107    Europe
      36  Ronaldinho                      95 million     Gal Gadot                      105    Israel
      37  Emma Watson                     90 million     Kylian Mbappé                  105    France
      38  Chris Brown                     90 million     Vin Diesel                      97.5  United States
      39  Selena Gomez                    90 million     Lisa                            95.9  Thailand
      40  The Weeknd                      90 million     NASA                            94.3  United States
      41  Narendra Modi                   90 million     Dua Lipa                        88.6  United Kingdom, Albania
      42  Gareth Bale                     90 million     Priyanka Chopra                 88.5  India
      43  FC Bayern Munich                90 million     Shakira                         87.9  Colombia
      44  Nike Football                   85 million     NBA                             82.4  United States, Canada
      45  Vin Diesel                      85 million     Shraddha Kapoor                 81.7  India
      46  Justin Timberlake               85 million     David Beckham                   80.8  United Kingdom
      47  NASA Jet Propulsion Laboratory  85 million     Jennie                          80.2  South Korea
      48  David Beckham                   85 million     Snoop Dogg                      80.2  United States
      49  Maluma                          85 million     Khaby Lame                      79.7  Italy, Senegal
      50  Billie Eilish                   85 million     Gigi Hadid                      78.8  United States

This merge includes several inaccuracies, but for starters the code is of interest to me as a Python novice. Later we'll investigate how to debug and improve it (for instance, everything is hardcoded, whereas we really need to scrape old versions of the page to capture this information).
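
One of the inaccuracies is structural: the merge above pairs rows by position, so rank 3 in 2021 (Dwayne Johnson) lands on the same line as rank 3 in 2023 (Lionel Messi). A sketch of a name-keyed merge instead, reusing data21 and data23 from the cell above; note that exact-match names will miss renamed accounts such as 'Neymar Jr.' vs 'Neymar'.

from tabulate import tabulate  # data21 and data23 come from the cell above

# lookup from the 2023 table: name -> (followers23, country)
lookup23 = {name: (followers, country) for name, followers, country in data23[1:]}

merged = []
for rank21, name21, followers21 in data21[1:]:
    followers23, country = lookup23.get(name21, ('n/a', 'n/a'))
    merged.append([rank21, name21, followers21, followers23, country])

print(tabulate(merged, headers=["Rank21", "Name21", "Followers21", "Followers23", "Country"]))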

740. gouldstein#

bellissimo

741. bernstein!#

742. khloe#

  • why is Khloe Kardashian gaining so much ground in instagram popularity?
    \(\vdots\)

  • give an answer with more grit
    \(\vdots\)

  • much better. let's try this with even more grit!!! also, why has she gained more ground than the rest of the family? she isn't the most followed (she is #3 in the family). but in terms of position, she has gained the most

While I can provide some speculative reasons for Khloe Kardashian’s individual growth on Instagram compared to other family members, it’s important to note that this information is based on my knowledge cutoff in September 2021 and may not reflect the current situation.

  1. Authenticity and Relatability: Khloe Kardashian has often been perceived as more down-to-earth and relatable compared to her sisters, who have a more glamorous and high-profile image. This relatability factor might attract a wider audience and resonate with people who find her more relatable or approachable.

  2. Personal Transformation: Khloe Kardashian has undergone significant physical transformations over the years, which have been well-documented in the media and on social platforms. This journey of personal growth, including her fitness and wellness endeavors, may inspire and resonate with individuals who are on similar journeys or seeking motivation for self-improvement.

  3. Emphasis on Body Positivity: Khloe Kardashian has been vocal about body positivity and self-acceptance. Her openness about struggles with body image and her efforts to promote self-love may resonate with people who appreciate her vulnerability and messaging.

  4. Engaging Content Strategy: Khloe Kardashian’s content strategy might be contributing to her individual growth on Instagram. By sharing a mix of personal and professional content, she strikes a balance that appeals to a wider range of followers. Her posts often include glimpses into her personal life, fashion-related content, beauty tips, and family moments. This diverse content strategy may help her appeal to a broader audience.

  5. Targeting Niche Audiences: It’s possible that Khloe Kardashian has found success in targeting specific niche audiences on Instagram. By focusing on specific topics such as fitness, motherhood, or body positivity, she may have attracted followers who have a strong interest in those areas. This targeted approach can result in a more engaged and dedicated following.

Again, it’s important to emphasize that these are speculative reasons based on past observations. The dynamics of social media popularity can change over time, and the current landscape might be different from what was prevalent during my knowledge cutoff in September 2021.

743. paulsimon#

philosopher

744. mankind#

  1. Fix your attitude (i.e., burn as many calories as is necessary to getrrr done) [36]

    • No credo

    • No settings

  2. Fix your credo (i.e., your god, ideals, and values are unchanging)

    • Change settings to reflect your credo

    • Change attitude to reflect your credo

  3. Fix your settings (i.e., your environment is unchanging)

    • Have no credo and be somewhat agnostic

    • But gradually master the limited environmental preferences you’ve restricted yourself to.

745. clock#

  • study this code

  • what language is it?

  • and why so long?

  • used in timecheck

745. co-pilot#

  • signed up for github co-pilot a few days ago

  • can’t remember exactly when and whether it cost me anything

  • but it's today that i've seen it in action & am loving it

  • it is a game changer for me and reinforces my belief in ai

  • also clear to me that away from an ide like vscode, one will miss out on the benefits of ai

746. livedonation#

  • reverence

  • prudence

  • epicurean

  • commitment

  • dominion

07/21/2023#

747. record#

  • june 2023 hottest month on record

  • 07/20/2023 baddest journaling day on record [52]

  • i’m writing this in the middle of the night (on 07/20/2023)

748. science#

Hide code cell source
import matplotlib.pyplot as plt
import numpy as np

# Create data for the skill and challenge levels
skill_levels = np.linspace(0, 10, 100)
challenge_levels = np.linspace(0, 10, 100)

# Define the flow channel boundaries
flow_channel = skill_levels

# Adjust the phase and amplitude of the sinusoid wave
phase = np.pi / 16  # Reducing the wavelength by a quarter
amplitude = 1.5
flow_channel += np.sin(skill_levels + phase) * amplitude

# Define the yellow zone boundaries
yellow_zone_low = flow_channel - 1.5
yellow_zone_high = flow_channel + 1.5

# Define the sinusoid function with the middle yellow line as its axis
sinusoid = flow_channel + np.sin(skill_levels + phase) * amplitude

# Define the anxiety and boredom areas
anxiety_area = np.where(challenge_levels > flow_channel, challenge_levels, np.nan)
boredom_area = np.where(challenge_levels < flow_channel, challenge_levels, np.nan)

# Plotting
plt.figure(figsize=(8, 6))

# Plot the anxiety and boredom areas
plt.fill_between(skill_levels, flow_channel, 10, color='red', alpha=0.3, label='Anxiety', interpolate=True)
plt.fill_between(skill_levels, 0, flow_channel, color='green', alpha=0.3, label='Boredom', interpolate=True)
plt.fill_between(skill_levels, yellow_zone_low, yellow_zone_high, color='#FFCC00', alpha=0.3, label='Flow Channel', interpolate=True)

# Plot the sinusoid function
plt.plot(skill_levels, sinusoid, color='purple', linestyle='-')

# Add arrowhead to the sinusoid line (flipped direction)
plt.arrow(skill_levels[-2], sinusoid[-2], skill_levels[-1] - skill_levels[-2], sinusoid[-1] - sinusoid[-2],
          color='purple', length_includes_head=True, head_width=-0.15, head_length=-0.3)

# Plot the flow channel boundaries
plt.plot(skill_levels, flow_channel, color='#FFCC00', linestyle='-')

# Set plot labels and title
plt.xlabel('State-of-the-art')
plt.ylabel('Population', rotation='horizontal', ha='right')  # Rotate the label horizontally

# Set plot limits and grid
plt.xlim(0, 10)
plt.ylim(0, 10)
plt.grid(True)

# Set tick labels
tick_labels = ['0', '2', '4', '6', '8', '10']
plt.xticks(np.linspace(0, 10, 6), tick_labels)
plt.yticks(np.linspace(0, 10, 6), tick_labels)

# Add text annotations to label the areas
plt.text(2.8, 7.9, 'bigdata', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='pink', edgecolor='pink', boxstyle='round'))
plt.text(5, 5.1, 'cohort', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='#FFCC00', edgecolor='#FFCC00', boxstyle='round'))
plt.text(6.2, 2.1, 'trial', color='black', ha='center', va='center', fontsize=12, bbox=dict(facecolor='lightgreen', edgecolor='lightgreen', boxstyle='round'))

# Display the plot
plt.show()
(figure output: the flow-channel plot above, relabeled with 'bigdata', 'cohort', and 'trial')
  • psychology

  • sociology

  • anthropology

  • economics

  • political science

  • history

  • philosophy

  • theology

  • literature

  • music

  • art

  • architecture

  • engineering

  • mathematics

  • physics

  • chemistry

  • biology

  • medicine

  • computer science

  • data science

  • statistics

  • epidemiology

  • public health

  • public policy

  • law

  • business

  • finance

  • accounting

  • marketing

  • management

  • operations

  • supply chain

  • logistics

  • transportation

  • communication

  • journalism

  • education

  • linguistics

  • geography

  • geology

  • astronomy

  • cosmology

  • astrology

  • meteorology

  • oceanography

  • ecology

  • environmental science

  • agriculture

  • forestry

  • fisheries

  • food science

  • nutrition

  • culinary arts

  • sports

  • recreation

  • leisure

  • travel

  • tourism

  • hospitality

  • fashion

  • textiles

  • interior design

  • industrial design

  • product design

  • graphic design

  • animation

  • film

  • photography

  • theater

  • dance

  • choreography

  • music

  • literature

  • poetry

  • prose

  • fiction
    \(\vdots\)

749. references#

  1. Here’s one style:

# In _config.yml
sphinx:
  config:
    bibtex_reference_style: author_year
  2. Here’s another style:

Here’s some text and here’s my label.

  3. Here’s a third style:

  • I’ve only been able to use the author_year style.

  • I’ve tried unsrt, numbered and plain but they don’t work.

  • Even author_year doesn’t work for me in the sphinx config as I’d like it to.

  • So how do I get the author_year style to work in the sphinx config?
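
For what it's worth, here is my best guess at a fuller _config.yml, assuming the bibliography runs through sphinxcontrib-bibtex with a references.bib file at the project root; the exact keys may vary by jupyter-book version, so treat this as a sketch rather than a verified setup.

# In _config.yml (a sketch, not verified against this book's setup)
bibtex_bibfiles:
  - references.bib
sphinx:
  config:
    bibtex_reference_style: author_year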

07/22/2023#

750. markedlystructuredtext#

we’re now ready to add the following features: [53]

  • bibtex

  • myst

  • ?

Note

Here is a note

Tip

Here is a tip for you

This is my admonition

This is my note

a = 2
print('my 1st line')
print(f'my {a}nd line')
(image: bloc/img/fun-fish.png)

Party time!#

A reference from inside: Party time!

A reference from outside: syntax/directives/parsing

Note

The next info should be nested

Warning

Here’s my warning

Warning

Here’s my raw text warning that isn’t parsed…

Since Pythagoras, we know that \(a^2 + b^2 = c^2\)

\[e^{i\pi} + 1 = 0\]

Euler’s identity, equation (), was elected one of the most beautiful mathematical formulas.
